Here’s Stephen Hawking’s Final Theory About the Big Bang

In honor of Dr. Stephen Hawking, the COSMOS center will be creating the most detailed 3D mapping effort of the Universe to date. Credit: BBC, Illus.: T.Reyes

Stephen Hawking is rightly seen as one of the most influential scientists of our time. In his time on this planet, the famed physicist, science communicator, author and luminary became a household name, synonymous with the likes of Einstein, Newton and Galileo. What is even more impressive is the fact that he managed to maintain his commitment to science, education and humanitarian efforts despite suffering from a slow, degenerative disease.

Even though Hawking recently passed away, his influence is still being felt. Shortly before his death, Hawking submitted a paper offering his final theory on the origins of the Universe. The paper, which was published earlier this week (on Wednesday, May 2nd), offers a new take on the Big Bang Theory that could revolutionize the way we think of the Universe, how it was created, and how it evolved.

The paper, titled “A smooth exit from eternal inflation?”, was published in the Journal of High Energy Physics. The theory was first announced at a conference at the University of Cambridge in July of last year, where Professor Thomas Hertog (a Belgian physicist at KU Leuven University) shared Hawking’s paper (which Hertog co-authored) on the occasion of Hawking’s 75th birthday.

Stephen Hawking’s final theory on the Big Bang, submitted shortly before he passed away, was recently published. Credit: University of Cambridge

According to the current scientific consensus, all of the current and past matter in the Universe came into existence at the same time – roughly 13.8 billion years ago. At this time, all matter was compacted into an infinitesimally small point of immense density and intense heat. Suddenly, this point started to inflate at an exponential rate, and the Universe as we know it began.

However, it is widely believed that since this inflation started, quantum effects will keep it going forever in some regions of the Universe. This means that globally, the Universe’s inflation is eternal. In this respect, the observable part of our Universe (measuring 13.8 billion light-years in any direction) is just a region in which inflation has ended and stars and galaxies formed.

As Hawking explained in an interview with Cambridge University last autumn:

“The usual theory of eternal inflation predicts that globally our universe is like an infinite fractal, with a mosaic of different pocket universes, separated by an inflating ocean. The local laws of physics and chemistry can differ from one pocket universe to another, which together would form a multiverse. But I have never been a fan of the multiverse. If the scale of different universes in the multiverse is large or infinite the theory can’t be tested.”

In their new paper, Hawking and Hertog offer a new theory that predicts that the Universe is not an infinite fractal-like multiverse, but is finite and reasonably smooth. In short, they theorize that eternal inflation, as part of the theory of the Big Bang, is wrong. As Hertog explained:

“The problem with the usual account of eternal inflation is that it assumes an existing background universe that evolves according to Einstein’s theory of general relativity and treats the quantum effects as small fluctuations around this. However, the dynamics of eternal inflation wipes out the separation between classical and quantum physics. As a consequence, Einstein’s theory breaks down in eternal inflation.”

In contrast to this, Hawking and Hertog offer an explanation based on String Theory, a branch of theoretical physics that attempts to unify General Relativity with quantum physics. This theory was proposed to explain how gravity interacts with the three other fundamental forces of the Universe (weak and strong nuclear forces and electromagnetism), thus producing a Theory of Everything (ToE).

To put it simply, this theory describes the fundamental constituents of the Universe as tiny, one-dimensional vibrating strings. Hawking and Hertog’s approach uses the holography concept of string theory, which postulates that the Universe is a large and complex hologram. In this theory, physical reality in certain 3D spaces can be mathematically reduced to 2D projections on a surface.

 

This illustration shows the evolution of the Universe, from the Big Bang on the left, to modern times on the right. Image: NASA

Together, Hawking and Hertog developed a variation of this concept to project out the dimension of time in eternal inflation. This enabled them to describe eternal inflation without having to rely on General Relativity, thus reducing inflation to a timeless state defined on a spatial surface at the beginning of time. In this respect, the new theory represents a change from Hawking’s earlier work on “no boundary theory”.

Also known as the Hartle–Hawking No Boundary Proposal, this theory viewed the Universe like a quantum particle – assigning it a wave function that described all possible Universes. This theory also predicted that if you go back in time to the beginning of the Universe, it would shrink and close off like a sphere. Lastly, it predicted that the Universe would eventually stop expanding and collapse in on itself.

As Hertog explains, this new theory is a departure from that earlier work:

“When we trace the evolution of our universe backwards in time, at some point we arrive at the threshold of eternal inflation, where our familiar notion of time ceases to have any meaning. Now we’re saying that there is a boundary in our past.”

Using this theory, Hawking and Hertog were able to derive more reliable predictions about the global structure of the Universe. In addition, a Universe predicted to emerge from eternal inflation on the past boundary is also finite and much simpler. Last, but not least, the theory is more predictive and testable than the infinite Multiverse predicted by the old theory of eternal inflation.

 

Artist’s impression of merging binary black holes. Credit: LIGO/A. Simonnet.

“We are not down to a single, unique universe, but our findings imply a significant reduction of the multiverse, to a much smaller range of possible universes,” said Hawking. In theory, a finite and smooth Universe is one we can observe (at least locally) and will be governed by physical laws that we are already familiar with. Compared to an infinite number of Universes governed by different physical laws, it certainly simplifies the math!

Looking ahead, Hertog plans to study the implications of this theory on smaller scales using data obtained by space telescopes about the local Universe. In addition, he hopes to take advantage of recent studies concerning gravitational waves (GWs) and the many events that have been detected. Essentially, Hertog believes that primordial GWs generated at the exit from eternal inflation are the most promising means to test the model.

Due to the expansion of our Universe since the Big Bang, these GWs would have very long wavelengths, ones which are outside the normal range of the Laser Interferometer Gravitational-Wave Observatory‘s (LIGO) or Virgo‘s detectors. However, the Laser Interferometer Space Antenna (LISA) – an ESA-led plan for a space-based gravitational wave observatory – and other future experiments may be capable of measuring them.
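To get a sense of the scales involved, note that gravitational waves travel at the speed of light, so wavelength is simply the speed of light divided by frequency. The following back-of-the-envelope sketch uses representative, rounded band edges for each detector (illustrative assumptions, not figures from the paper):

```python
# Rough gravitational-wave wavelengths for representative detector bands.
# Band edges below are illustrative round numbers, not values from the study.
C = 299_792_458.0  # speed of light, m/s

bands = {
    "LIGO/Virgo (~10 Hz to ~1 kHz)": (10.0, 1e3),
    "LISA (~0.1 mHz to ~0.1 Hz)": (1e-4, 1e-1),
}

for name, (f_low, f_high) in bands.items():
    # wavelength = c / frequency; lower frequency means longer wavelength
    print(f"{name}: ~{C / f_high:.2e} m to ~{C / f_low:.2e} m")
```

Running this gives wavelengths of hundreds to tens of thousands of kilometers for LIGO and Virgo, versus millions to billions of kilometers for LISA – which is why primordial signals stretched by cosmic expansion favor space-based detectors.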

Even though he is no longer with us, Hawking’s final theory could be his most profound contribution to science. If future research should prove him correct, then Hawking will have resolved one of the most daunting problems in modern astrophysics and cosmology. Just one more achievement from a man who spent his life changing how people think about the Universe!

Further Reading: University of Cambridge

Every Time Lightning Strikes, Matter-Antimatter Annihilation Happens too

A Kyoto University-based team has unraveled the mystery of gamma-ray emission cascades caused by lightning strikes. Credit: Kyoto University/Teruaki Enoto

Lightning has always been a source of awe and mystery for us lowly mortals. In ancient times, people associated it with gods like Zeus and Thor, the fathers of the Greek and Norse pantheons. With the birth of modern science and meteorology, lightning is no longer considered the province of the divine. However, this does not mean that the sense of mystery it carries has diminished one bit.

For example, scientists have found that lightning occurs in the atmospheres of other planets, like the gas giant Jupiter (appropriately!) and the hellish world of Venus. And according to a recent study from Kyoto University, gamma rays caused by lightning interact with air molecules, regularly producing radioisotopes and even positrons – the antimatter version of electrons.

The study, titled “Photonuclear Reactions Triggered by Lightning Discharge”, recently appeared in the scientific journal Nature. The study was led by Teruaki Enoto, a researcher from The Hakubi Center for Advanced Research at Kyoto University, and included members from the University of Tokyo, Hokkaido University, Nagoya University, the RIKEN Nishina Center, the MAXI Team, and the Japan Atomic Energy Agency.

For some time, physicists have been aware that small bursts of high-energy gamma rays can be produced by lightning storms – what are known as “terrestrial gamma-ray flashes”. They are believed to be the result of static electrical fields accelerating electrons, which are then slowed by the atmosphere. This phenomenon was first discovered by space-based observatories, and rays of up to 100 million electron volts (100 MeV) have been observed.

Given the energy levels involved, the Japanese research team sought to examine how these bursts of gamma rays interact with air molecules. As Teruaki Enoto from Kyoto University, who leads the project, explained in a Kyoto University press release:

“We already knew that thunderclouds and lightning emit gamma rays, and hypothesized that they would react in some way with the nuclei of environmental elements in the atmosphere. In winter, Japan’s western coastal area is ideal for observing powerful lightning and thunderstorms. So, in 2015 we started building a series of small gamma-ray detectors, and placed them in various locations along the coast.”

Unfortunately, the team ran into funding problems along the way. As Enoto explained, they decided to reach out to the general public and established a crowdfunding campaign to fund their work. “We set up a crowdfunding campaign through the ‘academist’ site,” he said, “in which we explained our scientific method and aims for the project. Thanks to everybody’s support, we were able to make far more than our original funding goal.”

Thanks to the success of their campaign, the team built and installed particle detectors across the northwest coast of Honshu. In February of 2017, they installed four more detectors in Kashiwazaki city, in Niigata Prefecture. Immediately after the detectors were installed, a lightning strike took place a few hundred meters away, and the team was able to study it.

What they found was something entirely new and unexpected. After analyzing the data, the team detected three distinct gamma-ray bursts of varying duration. The first was less than a millisecond long, the second was a gamma-ray afterglow that took several milliseconds to decay, and the last was a prolonged emission lasting about one minute. As Enoto explained:

“We could tell that the first burst was from the lightning strike. Through our analysis and calculations, we eventually determined the origins of the second and third emissions as well.”

They determined that the second, afterglow emission was caused by gamma rays from the lightning reacting with nitrogen in the atmosphere. Essentially, these gamma rays are capable of knocking a neutron out of a nitrogen nucleus, and it was the reabsorption of these neutrons by other atmospheric nuclei that produced the gamma-ray afterglow. The final, prolonged emission was the result of the unstable nitrogen atoms breaking down.

It was here that things really got interesting. As the unstable nitrogen broke down, it released positrons that then collided with electrons, causing matter-antimatter annihilations that released more gamma rays. As Enoto explained, this demonstrated, for the first time, that antimatter is something that can occur in nature due to common mechanisms.
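For those curious about the nuclear bookkeeping, the chain of events described above can be sketched with standard photonuclear and beta-decay reactions (the notation here is generic textbook physics, not taken verbatim from the paper):

$$\gamma + {}^{14}\mathrm{N} \rightarrow {}^{13}\mathrm{N} + n \quad \text{(a gamma ray knocks a neutron out of nitrogen)}$$

$$n + {}^{14}\mathrm{N} \rightarrow {}^{15}\mathrm{N} + \gamma \quad \text{(neutron capture produces the afterglow)}$$

$${}^{13}\mathrm{N} \rightarrow {}^{13}\mathrm{C} + e^{+} + \nu_{e} \quad \text{(the unstable nitrogen decays, releasing a positron)}$$

$$e^{+} + e^{-} \rightarrow 2\gamma \quad \text{(annihilation yields gamma rays of 0.511 MeV each)}$$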

“We have this idea that antimatter is something that only exists in science fiction,” he said. “Who knew that it could be passing right above our heads on a stormy day? And we know all this thanks to our supporters who joined us through ‘academist’. We are truly grateful to all.”

If these results are indeed correct, then antimatter is not the extremely rare substance that we tend to think it is. In addition, the study could present new opportunities for high-energy physics and antimatter research. All of this research could also lead to the development of new or refined techniques for creating it.

Looking ahead, Enoto and his team hope to conduct more research using the ten detectors they still have operating along the coast of Japan. They also hope to continue involving the public with their research, a process that goes far beyond crowdfunding and includes the efforts of citizen scientists to help process and interpret data.

Further Reading: Kyoto University, Nature, NASA Goddard Media Studios

Hallelujah, It’s Raining Diamonds! Just like the Insides of Uranus and Neptune.

An experiment conducted by an international team of scientists recreated the "diamond rain" believed to exist in the interiors of ice giants like Uranus and Neptune. Credit: Greg Stewart/SLAC National Accelerator Laboratory

For more than three decades, the internal structure and evolution of Uranus and Neptune have been a subject of debate among scientists. Given their distance from Earth and the fact that only a few robotic spacecraft have studied them directly, what goes on inside these ice giants is still something of a mystery. In lieu of direct evidence, scientists have relied on models and experiments to replicate the conditions in their interiors.

For instance, it has been theorized that within Uranus and Neptune, the extreme pressure conditions squeeze hydrogen and carbon into diamonds, which then sink down into the interior. Thanks to an experiment conducted by an international team of scientists, this “diamond rain” was recreated under laboratory conditions for the first time, giving us the first glimpse into what things could be like inside ice giants.

The study detailing this experiment, titled “Formation of Diamonds in Laser-Compressed Hydrocarbons at Planetary Interior Conditions”, recently appeared in the journal Nature Astronomy. Led by Dr. Dominik Kraus, a physicist from the Helmholtz-Zentrum Dresden-Rossendorf Institute of Radiation Physics, the team included members from the SLAC National Accelerator Laboratory, the Lawrence Livermore National Laboratory and UC Berkeley.

Uranus and Neptune, the Solar System’s ice giant planets. Credit: Wikipedia Commons

For decades, scientists have held that the interiors of planets like Uranus and Neptune consist of solid cores surrounded by dense concentrations of “ices”. In this case, ice refers to hydrogen bound to heavier elements (i.e. carbon, oxygen and/or nitrogen) in compounds like water and ammonia. Under extreme pressure conditions, these compounds become semi-solid, forming “slush”.

And at roughly 10,000 kilometers (6,214 mi) beneath the surface of these planets, the compression of hydrocarbons is thought to create diamonds. To recreate these conditions, the international team subjected a sample of polystyrene plastic to two shock waves using an intense optical laser at the Matter in Extreme Conditions (MEC) instrument, which they then paired with x-ray pulses from SLAC’s Linac Coherent Light Source (LCLS).

As Dr. Kraus, the head of a Helmholtz Young Investigator Group at HZDR, explained in an HZDR press release:

“So far, no one has been able to directly observe these sparkling showers in an experimental setting. In our experiment, we exposed a special kind of plastic – polystyrene, which also consists of a mix of carbon and hydrogen – to conditions similar to those inside Neptune or Uranus.”

The plastic in this experiment simulated compounds formed from methane, a molecule that consists of one carbon atom bound to four hydrogen atoms. It is the presence of this compound that gives both Uranus and Neptune their distinct blue coloring. In the intermediate layers of these planets, it also forms hydrocarbon chains that are compressed into diamonds that could be millions of carats in weight.

The MEC hutch of SLAC’s LCLS Far Experiment Hall. Credit: SLAC National Accelerator Laboratory

The optical laser the team employed created two shock waves which accurately simulated the temperature and pressure conditions at the intermediate layers of Uranus and Neptune. The first shock was smaller and slower, and was then overtaken by the stronger second shock. When they overlapped, the pressure peaked and tiny diamonds began to form. At this point, the team probed the reactions with x-ray pulses from the LCLS.

This technique, known as x-ray diffraction, allowed the team to see the small diamonds form in real-time, which was necessary since a reaction of this kind can only last for fractions of a second. As Siegfried Glenzer, a professor of photon science at SLAC and a co-author of the paper, explained:

“For this experiment, we had LCLS, the brightest X-ray source in the world. You need these intense, fast pulses of X-rays to unambiguously see the structure of these diamonds, because they are only formed in the laboratory for such a very short time.”

In the end, the research team found that nearly every carbon atom in the original plastic sample was incorporated into small diamond structures. While these measured just a few nanometers in diameter, the team predicts that on Uranus and Neptune, the diamonds would be much larger. Over time, they speculate that these could sink through the planets’ interiors and form a layer of diamond around the core.

The interior structure of Neptune. Credit: Moscow Institute of Physics and Technology

In previous studies, attempts to recreate the conditions in Uranus and Neptune’s interior met with limited success. While they showed results that indicated the formation of graphite and diamonds, the teams conducting them could not capture the measurements in real-time. As noted, the extreme temperatures and pressures that exist within gas/ice giants can only be simulated in a laboratory for very short periods of time.

However, thanks to LCLS – which creates X-ray pulses a billion times brighter than previous instruments and fires them at a rate of about 120 pulses per second (each one lasting just quadrillionths of a second) – the science team was able to directly measure the chemical reaction for the first time. In the end, these results are of particular significance to planetary scientists who specialize in the study of how planets form and evolve.

As Kraus explained, it could cause scientists to rethink the relationship between a planet’s mass and its radius, and lead to new models of planet classification:

“With planets, the relationship between mass and radius can tell scientists quite a bit about the chemistry. And the chemistry that happens in the interior can provide additional information about some of the defining features of the planet… We can’t go inside the planets and look at them, so these laboratory experiments complement satellite and telescope observations.”

This experiment also opens new possibilities for matter compression and the creation of synthetic materials. Nanodiamonds currently have many commercial applications – e.g. medicine, electronics and scientific equipment – and creating them with lasers would be far more cost-effective and safe than current methods (which involve explosives).

Fusion research, which also relies on creating extreme pressure and temperature conditions to generate abundant energy, could also benefit from this experiment. On top of that, the results of this study offer a tantalizing hint at what the cores of massive planets look like. In addition to being composed of silicate rock and metals, ice giants may also have a diamond layer at their core-mantle boundary.

Assuming we can create probes of sufficiently strong super-materials someday, wouldn’t that be worth looking into?

Further Reading: SLAC, HZDR, Nature Astronomy

 

Experiment Detects Mysterious Neutrino-Nucleus Scattering For the First Time

The Spallation Neutron Source, located at the Oak Ridge National Laboratory. Credit: neutrons.ornl.gov

Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and only interact with others via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from any interference.

However, using the Spallation Neutron Source (SNS), a research facility located at the Oak Ridge National Laboratory (ORNL), an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. As part of the COHERENT experiment, these results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.


Physicists Take Big Step Towards Quantum Computing and Encryption with new Experiment

Artist’s concept of the experiment in which two atoms are being entangled over a distance of 400 meters. Credit: Wenjamin Rosenfeld

Quantum entanglement remains one of the most challenging fields of study for modern physicists. Described by Einstein as “spooky action at a distance”, it is an aspect of quantum mechanics that scientists have long sought to reconcile with classical mechanics. Essentially, the fact that two particles can be connected over great distances violates the rules of locality and realism.

Formally, this is a violation of Bell’s Inequality, a theorem that places strict limits on the correlations any theory based on locality and realism can produce. However, in a recent study, a team of researchers from the Ludwig-Maximilian University (LMU) and the Max Planck Institute for Quantum Optics in Munich conducted tests which once again violated Bell’s Inequality and proved the existence of entanglement.

Their study, titled “Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes”, was recently published in the Physical Review Letters. Led by Wenjamin Rosenfeld, a physicist at LMU and the Max Planck Institute for Quantum Optics, the team sought to test Bell’s Inequality by entangling two particles at a distance.

John Bell, the Irish physicist who devised a test to show that nature does not ‘hide variables’ as Einstein had proposed. Credit: CERN

Bell’s Inequality (named after Irish physicist John Bell, who proposed it in 1964) essentially states that properties of objects exist independent of being observed (realism), and no information or physical influence can propagate faster than the speed of light (locality). These rules perfectly describe the reality we human beings experience on a daily basis, where things are rooted in a particular space and time and exist independent of an observer.

However, at the quantum level, things do not appear to follow these rules. Not only can particles be connected in non-local ways over large distances (i.e. entanglement), but the properties of these particles cannot be defined until they are measured. And while all experiments have confirmed that the predictions of quantum mechanics are correct, some scientists have continued to argue that there are loopholes that allow for local realism.

To address this, the Munich team conducted an experiment using two laboratories at LMU. While the first lab was located in the basement of the physics department, the second was located in the basement of the economics department – roughly 400 meters away. In both labs, teams captured a single rubidium atom in an optical trap and then excited it until it released a single photon.

As Dr. Wenjamin Rosenfeld explained in a Max Planck Institute press release:

“Our two observer stations are independently operated and are equipped with their own laser and control systems. Because of the 400 meters distance between the laboratories, communication from one to the other would take 1328 nanoseconds, which is much more than the duration of the measurement process. So, no information on the measurement in one lab can be used in the other lab. That’s how we close the locality loophole.”

The experiment was performed in two locations 398 meters apart at the Ludwig Maximilian University campus in Munich, Germany. Credit: Rosenfeld et al/American Physical Society
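Rosenfeld’s 1328-nanosecond figure is straightforward to verify from the 398-meter separation: light (and therefore any physical influence) needs

$$t = \frac{d}{c} = \frac{398\ \mathrm{m}}{2.998 \times 10^{8}\ \mathrm{m/s}} \approx 1.33\ \mu\mathrm{s} \approx 1328\ \mathrm{ns}$$

to travel from one laboratory to the other – comfortably longer than the measurement itself, so neither lab can influence the other’s result in time.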

Once the two rubidium atoms were excited to the point of releasing a photon, the spin-states of the rubidium atoms and the polarization states of the photons were effectively entangled. The photons were then coupled into optical fibers and guided to a set-up where they were brought to interference. After conducting a measurement run for eight days, the scientists were able to collect around 10,000 events to check for signs of entanglement.

This would have been indicated by the spins of the two trapped rubidium atoms, which would be pointing in the same direction (or in the opposite direction, depending on the kind of entanglement). What the Munich team found was that for the vast majority of the events the atoms were in the same state (or in the opposite state), with only six events deviating – a correlation strong enough to violate Bell’s Inequality.

These results were also statistically more significant than those obtained by a team of Dutch physicists in 2015. For that study, the Dutch team conducted experiments using electrons in diamonds at labs that were 1.3 km apart. In the end, their results (and other recent tests of Bell’s Inequality) demonstrated that quantum entanglement is real, effectively closing the local realism loophole.

As Wenjamin Rosenfeld explained, the tests conducted by his team also went beyond these other experiments by addressing another major issue. “We were able to determine the spin-state of the atoms very fast and very efficiently,” he said. “Thereby we closed a second potential loophole: the assumption, that the observed violation is caused by an incomplete sample of detected atom pairs”.

By obtaining proof of the violation of Bell’s Inequality, scientists are not only helping to resolve an enduring incongruity between classical and quantum physics. They are also opening the door to some exciting possibilities. For instance, for years, scientists have anticipated the development of quantum processors, which rely on entanglement to store and process information in quantum analogues of the zeros and ones of binary code.

Computers that rely on quantum mechanics would be exponentially faster than conventional microprocessors, and would usher in a new age of research and development. The same principles have been proposed for cybersecurity, where quantum encryption would be used to encrypt information, making it invulnerable to hackers who rely on conventional computers.

Last, but certainly not least, there is the concept of quantum entanglement communications. While entanglement cannot be used to transmit information faster than the speed of light, it could allow for communication channels whose security is guaranteed by the laws of physics. Imagine the possibilities for space travel and exploration if our long-range communications could be made unconditionally secure!

Einstein wasn’t wrong when he characterized quantum entanglements as “spooky action”. Indeed, many of the implications of this phenomenon are still as frightening as they are fascinating to physicists. But the closer we come to understanding it, the closer we will be towards developing an understanding of how all the known physical forces of the Universe fit together – aka. a Theory of Everything!

Further Reading: LMU, Physical Review Letters

This is the Strangest Idea Ever for a Spacecraft Propulsion System: Ferrofluids

A ferrofluid is a magnetic liquid that turns spiky in a magnetic field. Add an electric field and each needle-like spike emits a jet of ions, which could solve micropropulsion for nanosatellites in space. Credit: MTU

When it comes to the future of space exploration, some truly interesting concepts are being developed. Hoping to reach farther and reduce associated costs, one of the overarching goals is to find more fuel-efficient and effective means of sending robotic spacecraft, satellites and even crewed missions to their destinations. Towards this end, ideas like nuclear propulsion, ion engines and even antimatter are all being considered.

But this idea has to be the strangest one to date! It’s known as a ferrofluid thruster, a new concept that relies on ionic liquids that become strongly magnetized when exposed to a magnetic field, and that release jets of ions when an electric field is applied as well. According to a new study produced by researchers from the Ion Space Propulsion Laboratory at Michigan Tech, this concept could very well be the future of satellite propulsion.

This study, which was recently published in the journal Physics of Fluids, presents an entirely new method for creating microthrusters – tiny nozzles that are used by small satellites to maneuver in orbit. Thanks to improvements in technology, small satellites – which are typically defined as those that weigh less than 500 kg (1,100 lbs) – can perform tasks that were once reserved for larger ones.

As the magnetic field is applied, the ferrofluid forms “peaks”, which disappear once the field is removed. Credit: MTU

As such, they are making up an increasingly large share of the satellite market, and many more are expected to be launched in the near future. In fact, it is estimated that between 2015 and 2019, over 500 small satellites will be launched to LEO, with an estimated market value of $7.4 billion. Little wonder, then, that researchers are looking at various types of microthrusters to ensure that these satellites can maneuver effectively.

While there is no shortage of possibilities, finding one that balances cost-effectiveness and reliability has been difficult. To address this, an MTU research team began conducting a study that considered ferrofluids as a possible solution. As noted, ferrofluids are ionic liquids that become active when exposed to a magnetic field, forming peaks that can emit small amounts of ions.

These peaks then return to a natural state when the magnetic field is removed; the spiky pattern itself is a phenomenon known as the Rosensweig instability. Led by Brandon A. Jackson – a doctoral candidate in mechanical engineering at Michigan Technological University – the MTU research team began to consider how this could be turned into propulsion. Other members included fellow doctoral candidate Kurt Terhune and Professor Lyon B. King.

Prof. King, the Ron & Elaine Starr Professor in Space Systems at Michigan Tech, has been researching the physics of ferrofluids for many years, thanks to support provided by the Air Force Office of Scientific Research (AFOSR). In 2012, he proposed using such ionic fluids to create a microthruster for modern satellites, based on previous studies conducted by researchers at the University of Sydney.

Without a magnetic field, ferrofluids look like a tarry, oil-based fuel. With a magnetic field, the propellant self-assembles, rising into a spiky ball. Credit: MTU

As he explained in a MTU press release, this method offers a simple and effective way to create a reliable microthruster:

“We’re working with a unique material called an ionic liquid ferrofluid. When we put a magnet underneath a small pool of the ferrofluid, it turns into a beautiful hedgehog structure of aligned peaks. When we apply a strong electric field to that array of peaks, each one emits an individual micro-jet of ions.”

With the help of King, who oversees MTU’s Ion Space Propulsion Laboratory, Jackson and Terhune began conducting an experimental and computational study on the dynamics of the ferrofluid. From this, they created a computational model that taught them much about the relationships between magnetic, electric and surface tension stresses, and they were even surprised by some of what they saw.

“We wanted to learn what led up to emission instability in one single peak of the ferrofluid microthruster,” said Jackson. “We learned that the magnetic field has a large effect in preconditioning the fluid electric stress.”

Cubesats being launched from the International Space Station. Credit: NASA

Ultimately, what they had created was a model for an electrospray ionic liquid ferrofluid thruster. Unlike conventional electrospray thrusters – which generate propulsion with electrical charges that send tiny jets of fluid through microscopic needles – a ferrofluid electrospray thruster would be able to do away with these needles, which are expensive to manufacture and vulnerable to damage.

Instead, the thruster they are proposing would be able to assemble itself out of its own propellant, would rely on no fragile parts, and would essentially be indestructible. It would also present advantages over conventional plasma thrusters, which are apparently unreliable when scaled down for small satellites. With the success of their model, the AFOSR recently decided to award King a second contract to continue studying ferrofluids.

With this funding secured, King is confident that they can put what they learned with this study to good use, and scale it up to examine what happens with multiple peaks. As he explained:

“Often in the lab we’ll have one peak working and 99 others loafing. Brandon’s model will be a vital tool for the team going forward. If we are successful, our thruster will enable small inexpensive satellites with their own propulsion to be mass produced. That could improve remote sensing for better climate modeling, or provide better internet connectivity, which three billion people in the world still do not have.”

In the coming years, small satellites are expected to make up an ever-increasing portion of all artificial objects that are currently in Low Earth Orbit. Credit: ESA

Looking ahead, the team wants to conduct experiments on how an actual thruster might perform. The team has also begun working with Professor Juan Fernandez de la Mora of Yale University, one of the world’s leading experts on electrospray propulsion, to help bring their proposal to fruition. Naturally, it will take many years before a prototype is ready, and such a thruster would likely have to sustain about 100 emitting peaks to be considered viable.

Nevertheless, the technology holds promise for a market that is expected to grow by leaps and bounds in the coming years and decades. Facilitating everything from worldwide internet access and telecommunications to scientific research, there is likely to be no shortage of smallsats, cubesats, nanosats, etc. taking to space very soon. They will all need reliable propulsion if they want to be able to stay clear of each other and do their jobs!

Michigan Tech also has patents pending for the technology, which has applications that go beyond propulsion to include spectrometry, pharmaceuticals, and nanofabrication.

Further Reading: MTU, Physics of Fluids

We’re One Step Closer to Knowing Why There’s More Matter Than Antimatter in the Universe

Credit: University of Tokyo

The Standard Model of particle physics has for decades been the predominant means of explaining what the basic building blocks of matter are and how they interact. First proposed in the 1970s, the model holds that for every particle of matter created, a corresponding anti-particle is created as well. As such, an enduring mystery posed by this model is why the Universe can exist if it is theoretically made up of equal parts of matter and antimatter.

A possible explanation for this disparity, known as charge-parity (CP) violation, has been the subject of experiments for many years. But so far, no definitive demonstration has been made of this violation, or of how so much matter can exist in the Universe without its counterpart. But thanks to new findings released by the international Tokai-to-Kamioka (T2K) collaboration, we may be one step closer to understanding why this disparity exists.

First observed in 1964, CP violation means that under certain conditions, the laws of charge-symmetry and parity-symmetry (aka. CP-symmetry) do not apply. These laws state that the physics governing a particle should be the same if it were interchanged with its antiparticle while its spatial coordinates were inverted. From this observation, one of the greatest cosmological mysteries emerged.

If the laws governing matter and antimatter are the same, then why is it that the Universe is so matter-dominated? Alternately, if matter and antimatter are fundamentally different, then how does this accord with our notions of symmetry? Answering these questions is not only important as far as our predominant cosmological theories go; it is also intrinsic to understanding how the weak interactions that govern particles work.

Established in June of 2011, the international T2K collaboration is the first experiment in the world dedicated to answering this mystery by studying neutrino and anti-neutrino oscillations. The experiment begins with high-intensity beams of muon neutrinos (or muon anti-neutrinos) being generated at the Japan Proton Accelerator Research Complex (J-PARC), which are then fired towards the Super-Kamiokande detector 295 km away.

This detector is currently one of the world’s largest and most sophisticated, dedicated to the detection and study of solar and atmospheric neutrinos. As neutrinos travel between the two facilities, they change “flavor” – going from muon neutrinos or anti-neutrinos to electron neutrinos or anti-neutrinos. In monitoring these neutrino and anti-neutrino beams, the experiment watches for different rates of oscillation.

This difference in oscillation would show that there is an imbalance between particles and antiparticles, and thus provide the first definitive evidence of CP violation in neutrinos. It would also indicate that there is physics beyond the Standard Model that scientists have yet to probe. This past April, the first data set produced by T2K was released, which provided some telling results.

The detected pattern of an electron neutrino candidate event observed by Super-Kamiokande. Credit: Kavli IPMU

As Mark Hartz, a T2K collaborator and the Kavli IPMU Project Assistant Professor, said in a recent press release:

“While the data sets are still too small to make a conclusive statement, we have seen a weak preference for large CP violation and we are excited to continue to collect data and make a more sensitive search for CP violation.”

These results, which were recently published in the Physical Review Letters, include all data runs from January 2010 to May 2016. In total, this data comprised 7.482 × 10²⁰ protons (in neutrino mode), which yielded 32 electron neutrino and 135 muon neutrino events, and 7.471 × 10²⁰ protons (in antineutrino mode), which yielded 4 electron anti-neutrino and 66 muon anti-neutrino events.

In other words, the first batch of data has provided some evidence for CP violation, at a confidence level of 90%. But this is just the beginning, and the experiment is expected to run for another ten years before wrapping up. “If we are lucky and the CP violation effect is large, we may expect 3 sigma evidence, or about 99.7% confidence level, for CP violation by 2026,” said Hartz.
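For readers unfamiliar with “sigma” language, significance levels map onto confidence levels through the standard normal distribution. A minimal sketch (textbook statistics, not T2K’s actual analysis code):

```python
# Convert an "n sigma" significance into a two-sided confidence level.
# Standard normal statistics for illustration; not T2K's analysis.
from math import erf, sqrt

for n_sigma in (1, 2, 3, 5):
    confidence = erf(n_sigma / sqrt(2))  # P(|Z| < n * sigma)
    print(f"{n_sigma} sigma -> {confidence:.4%} confidence")
```

This reproduces the familiar benchmarks: 1 sigma is about 68%, 2 sigma about 95%, and 3 sigma about 99.7% – the threshold Hartz mentions for 2026.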

If the experiment proves successful, physicists may finally be able to answer how it is that the early Universe didn’t annihilate itself. It is also likely to help reveal aspects of the Universe that particle physicists are anxious to get into! For it is here that the answers to the deepest secrets of the Universe, like how all of its fundamental forces fit together, are likely to be found.

Further Reading: Kavli IPMU, Physical Review Letters

New Way to Make Plasma Propulsion Lighter and More Efficient

Image of the Neptune thruster (right) with plasma expanding into a space simulation chamber. Credit: Dmytro Rafalskyi

Plasma propulsion is a subject of keen interest to scientists and space agencies. As a highly-advanced technology that offers considerable fuel-efficiency over conventional chemical rockets, it is currently being used in everything from spacecraft and satellites to exploratory missions. And looking to the future, flowing plasma is also being investigated for more advanced propulsion concepts, as well as magnetically-confined fusion.

However, a common problem with plasma propulsion is the fact that it relies on what is known as a “neutralizer”. This instrument, which allows spacecraft to remain charge-neutral, is an additional drain on power. Luckily, a team of researchers from the University of York and École Polytechnique are investigating a plasma thruster design that would do away with a neutralizer altogether.

A study detailing their research findings – titled “Transient propagation dynamics of flowing plasmas accelerated by radio-frequency electric fields” – was released earlier this month in Physics of Plasmas, a journal published by the American Institute of Physics. Led by Dr. James Dedrick, a physicist from the York Plasma Institute at the University of York, the team presents a concept for a self-regulating plasma thruster.

A 6 kW Hall thruster in operation at NASA’s Jet Propulsion Laboratory. Credit: NASA/JPL

Basically, plasma propulsion systems rely on electric power to ionize propellant gas and transform it into plasma (i.e. negatively charged electrons and positively-charged ions). These ions and electrons are then accelerated by engine nozzles to generate thrust and propel a spacecraft. Examples include the Gridded-ion and Hall-effect thruster, both of which are established propulsion technologies.

The Gridded-ion thruster was first tested in the 1960s and 70s as part of the Space Electric Rocket Test (SERT) program. Since then, it has been used by NASA’s Dawn mission, which is currently exploring Ceres in the Main Asteroid Belt. And in the future, the ESA and JAXA plan to use Gridded-ion thrusters to propel their BepiColombo mission to Mercury.

Similarly, Hall-effect thrusters have been investigated since the 1960s by both NASA and the Soviet space programs. For the ESA, they were first used as part of the Small Missions for Advanced Research in Technology-1 (SMART-1) mission. This mission, which launched in 2003 and crashed into the lunar surface three years later, was the first ESA mission to go to the Moon.

As noted, spacecraft that use these thrusters all require a neutralizer to ensure that they remain “charge-neutral”. This is necessary since conventional plasma thrusters generate more positively-charged particles than they do negatively-charged ones. As such, neutralizers inject electrons (which carry a negative charge) in order to maintain the balance between positive and negative ions.

An artist’s illustration of NASA’s Dawn spacecraft with its ion propulsion system approaching Ceres. Credit: NASA/JPL-Caltech.

As you might suspect, these electrons are generated by the spacecraft’s electrical power systems, which means that the neutralizer is an additional drain on power. The addition of this component also means that the propulsion system itself will have to be larger and heavier. To address this, the York/École Polytechnique team proposed a design for a plasma thruster that can remain charge neutral on its own.

Known as the Neptune engine, this concept was first demonstrated in 2014 by Dmytro Rafalskyi and Ane Aanesland, two researchers from the École Polytechnique’s Laboratory of Plasma Physics (LPP) and co-authors on the recent paper. As they demonstrated, the concept builds upon the technology used to create gridded-ion thrusters, but manages to generate exhaust that contains comparable amounts of positively and negatively charged ions.

As they explain in the course of their study:

“Its design is based on the principle of plasma acceleration, whereby the coincident extraction of ions and electrons is achieved by applying an oscillating electrical field to the gridded acceleration optics. In traditional gridded-ion thrusters, ions are accelerated using a designated voltage source to apply a direct-current (dc) electric field between the extraction grids. In this work, a dc self-bias voltage is formed when radio-frequency (rf) power is coupled to the extraction grids due to the difference in the area of the powered and grounded surfaces in contact with the plasma.”

The Hall-effect thruster used by the SMART-1 mission, which relied on xenon as its reaction mass. Copyright: ESA

In short, the thruster creates exhaust that is effectively charge-neutral through the application of radio waves, which accelerate electrons out alongside the ions and effectively remove the need for a neutralizer. As their study found, the Neptune thruster is also capable of generating thrust that is comparable to a conventional ion thruster.
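To put “comparable thrust” in perspective, the thrust of any gridded ion engine follows from its beam current and acceleration voltage. Here is a minimal sketch using textbook relations and purely hypothetical numbers – none of these values come from the Neptune study:

```python
# Back-of-the-envelope ion-thruster thrust: F = (mass flow) x (exhaust speed).
# All numbers are illustrative assumptions, not values from the Neptune paper.
from math import sqrt

E = 1.602e-19      # elementary charge, C
M_ION = 2.18e-25   # mass of a xenon ion, kg (a common ion-thruster propellant)

def thrust_newtons(beam_current: float, accel_voltage: float) -> float:
    """Thrust of a singly charged ion beam (current in A, voltage in V)."""
    v_exhaust = sqrt(2 * E * accel_voltage / M_ION)  # ion exit speed, m/s
    mass_flow = (beam_current / E) * M_ION           # ion rate times mass, kg/s
    return mass_flow * v_exhaust

# A hypothetical 1 mA beam accelerated through 1 kV yields tens of
# micronewtons -- the right scale for maneuvering a small satellite.
print(f"{thrust_newtons(1e-3, 1e3) * 1e6:.0f} micronewtons")
```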

To advance the technology even further, they teamed up with James Dedrick and Andrew Gibson from the York Plasma Institute to study how the thruster would work under different conditions. With Dedrick and Gibson on board, they began to study how the plasma beam might interact with space and whether this would affect its balanced charge.

What they found was that the engine’s exhaust beam played a large role in keeping the beam neutral, where the propagation of electrons after they are introduced at the extraction grids acts to compensate for space-charge in the plasma beam. As they state in their study:

“[P]hase-resolved optical emission spectroscopy has been applied in combination with electrical measurements (ion and electron energy distribution functions, ion and electron currents, and beam potential) to study the transient propagation of energetic electrons in a flowing plasma generated by an rf self-bias driven plasma thruster. The results suggest that the propagation of electrons during the interval of sheath collapse at the extraction grids acts to compensate space-charge in the plasma beam.”

Naturally, they also emphasize that further testing will be needed before a Neptune thruster can ever be used. But the results are encouraging, since they offer up the possibility of ion thrusters that are lighter and smaller, which would allow for spacecraft that are even more compact and energy-efficient. For space agencies looking to explore the Solar System (and beyond) on a budget, such technology is nothing if not desirable!

Further Reading: Physics of Plasmas, AIP

Another Strange Discovery From LHC That Nobody Understands

New results from ALICE at the Large Hadron Collider show so-called strange hadrons being created where none were expected. As the number of proton-proton collisions (the blue lines) increase, the more of these strange hadrons are seen (as shown by the red squares in the graph). (Image: CERN)

There are some strange results being announced in the physics world lately. A fluid with a negative effective mass and the discovery of five new particles are both challenging our understanding of the universe.

New results from ALICE (A Large Ion Collider Experiment) are adding to the strangeness.

ALICE is a detector on the Large Hadron Collider (LHC). It’s one of seven detectors, and ALICE’s role is to “study the physics of strongly interacting matter at extreme energy densities, where a phase of matter called quark-gluon plasma forms,” according to the CERN website. Quark-gluon plasma is a state of matter that existed only a few millionths of a second after the Big Bang.

In what we might call normal matter—that is the familiar atoms that we all learn about in high school—protons and neutrons are made up of quarks. Those quarks are held together by other particles called gluons. (“Glue-ons,” get it?) In a state known as confinement, these quarks and gluons are permanently bound together. In fact, quarks have never been observed in isolation.

A cut-away view of the ALICE detector at CERN’s LHC. Image: By Pcharito – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=31365856

The LHC is used to collide particles together at extremely high speeds, creating temperatures that can be 100,000 times hotter than the center of our Sun. When heavy ions such as lead nuclei are collided, the resulting extreme conditions come close to replicating the state of the Universe those few millionths of a second after the Big Bang.
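That comparison is easy to sanity-check: the Sun’s core sits at roughly $1.6 \times 10^{7}$ K, so

$$T \sim 10^{5} \times 1.6 \times 10^{7}\ \mathrm{K} \approx 1.6 \times 10^{12}\ \mathrm{K},$$

a few trillion kelvin – the regime in which hadrons “melt” into quark-gluon plasma.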

In those extreme temperatures, the state of confinement was broken, and the quarks and gluons were released, forming a quark-gluon plasma.

So far, this is pretty well understood. But in these new results – this time from proton-proton collisions – something additional happened. There was increased production of what are called “strange hadrons.” Strange hadrons themselves are well-known particles. They have names like Kaon, Lambda, Xi and Omega. They’re called strange hadrons because they each contain at least one “strange quark.”

If all of this seems a little murky, here’s the kicker: strange hadrons may be well-known particles, because they’ve long been observed in collisions between heavy nuclei. But this enhanced production of them had never been seen in collisions between protons.

“Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system…opens up an entirely new dimension for the study of the properties of the fundamental state that our universe emerged from.” – Federico Antinori, Spokesperson of the ALICE collaboration.

“We are very excited about this discovery,” said Federico Antinori, Spokesperson of the ALICE collaboration. “We are again learning a lot about this primordial state of matter. Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system, such as the collision between two protons, opens up an entirely new dimension for the study of the properties of the fundamental state that our universe emerged from.”

Enhanced Strangeness?

The creation of quark-gluon plasma at CERN provides physicists an opportunity to study the strong interaction. The strong interaction is also known as the strong force, one of the four fundamental forces in the Universe, and the one that binds quarks into protons and neutrons. It’s also an opportunity to study something else: the increased production of strange hadrons.

In a delicious turn of phrase, CERN calls this phenomenon “enhanced strangeness production.” (Somebody at CERN has a flair for language.)

Enhanced strangeness production from quark-gluon plasma was predicted in the 1980s, and was observed in the 1990s at CERN’s Super Proton Synchrotron. The ALICE experiment at the LHC is giving physicists their best opportunity yet to study how proton-proton collisions can have enhanced strangeness production in the same way that heavy ion collisions can.

According to the press release announcing these results, “Studying these processes more precisely will be key to better understand the microscopic mechanisms of the quark-gluon plasma and the collective behaviour of particles in small systems.”

I couldn’t have said it better myself.

Team Creates Negative Effective Mass In The Lab

Researchers at WSU have created a fluid with a negative effective mass for the first time, which could open the door to studying the deeper mysteries of the Universe. Credit: ESA/Hubble, ESO, M. Kornmesser

When it comes to objects and force, Isaac Newton’s Three Laws of Motion are pretty straightforward. Apply force to an object in a specific direction, and the object will move in that direction. And unless there’s something acting against it (like gravity or air resistance) it will keep moving in that direction until something stops it. But when it comes to “negative mass”, the exact opposite is true.

As the name would suggest, the term refers to matter whose mass is opposite that of normal matter. Until a few years ago, negative mass was predominantly a theoretical concept and had only been observed in very specific settings. But according to a recent study by an international team of researchers, a fluid with a “negative effective mass” has been created under laboratory conditions for the first time.

To put it in the simplest terms, matter can have a negative mass in the same way that a particle can have a negative charge. When it comes to the Universe that we know and study on a regular basis, one could say that we have encountered only the positive form of mass. In fact, one could say that it is the same situation with matter and antimatter. Theoretical physics tells us both exist, but we only see the one on a regular basis.


As Dr. Michael McNeil Forbes – a Professor at Washington State University, a Fellow at the Institute for Nuclear Theory, and a co-author on the study – explained in a WSU press release:

“That’s what most things that we’re used to do. With negative mass, if you push something, it accelerates toward you. Once you push, it accelerates backwards. It looks like the rubidium hits an invisible wall.”

According to the team’s study, which was recently published in the Physical Review Letters (under the title “Negative-Mass Hydrodynamics in a Spin-Orbit–Coupled Bose-Einstein Condensate”), a negative effective mass can be created by altering the spin-orbit coupling of atoms. Led by Peter Engels – a professor of physics and astronomy at Washington State University – this consisted of using lasers to control the behavior of rubidium atoms.

They began by using a single laser to keep rubidium atoms in a bowl measuring less than 100 microns across. This had the effect of slowing the atoms down and cooling them to just above absolute zero, which resulted in the rubidium becoming a Bose-Einstein condensate. Named after Satyendra Nath Bose and Albert Einstein (who predicted how their atoms would behave), this type of condensate behaves like a superfluid.
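The extreme cooling is what makes condensation possible: a gas crosses into the Bose-Einstein regime roughly when each atom’s thermal de Broglie wavelength becomes comparable to the spacing between atoms. In textbook form (a standard criterion, not specific to this experiment):

$$n\,\lambda_{\mathrm{dB}}^{3} \gtrsim 2.612, \qquad \lambda_{\mathrm{dB}} = \frac{h}{\sqrt{2\pi m k_{B} T}},$$

where $n$ is the number density, $m$ the atomic mass and $T$ the temperature. Lowering $T$ stretches $\lambda_{\mathrm{dB}}$ until the atomic matter waves overlap and the gas settles into a single quantum state.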

Velocity-distribution data (3 views) for a gas of rubidium atoms, confirming the discovery of a new phase of matter, the Bose–Einstein condensate. Credit: NIST/JILA/CU-Boulder

Basically, this means that their particles move very slowly and behave like waves, but without losing any energy. A second set of lasers was then applied to move the atoms back and forth, effectively changing the way they spin. Prior to the change in their spins, the superfluid had regular mass, and breaking the “bowl” would result in the atoms pushing out and expanding away from their center of mass.

But after the application of the second laser, the rubidium rushed out and accelerated in the opposite direction – consistent with how a negative mass would behave. This represented a break with previous laboratory experiments, where researchers were unable to get atoms to behave in a way that was consistent with negative mass. But as Forbes explained, the WSU experiment avoided some of the underlying defects encountered in those experiments:

“What’s a first here is the exquisite control we have over the nature of this negative mass, without any other complications. It provides another environment to study a fundamental phenomenon that is very peculiar.”

And while news of this experiment has been met with fanfare and claims to the effect that the researchers had “rewritten the laws of physics”, it is important to emphasize that this research has created a “negative effective mass” – which is fundamentally different from a negative mass.

Artist’s rendering of an outburst on an ultra-magnetic neutron star, also called a magnetar.
Credit: NASA/Goddard Space Flight Center

As Sabine Hossenfelder, a Research Fellow at the Frankfurt Institute for Advanced Studies, wrote on her website Backreaction in response to the news:

“Physicists use the preamble ‘effective’ to indicate something that is not fundamental but emergent, and the exact definition of such a term is often a matter of convention. The ‘effective radius’ of a galaxy, for example, is not its radius. The ‘effective nuclear charge’ is not the charge of the nucleus. And the ‘effective negative mass’ – you guessed it – is not a negative mass. The effective mass is merely a handy mathematical quantity to describe the condensate’s behavior.”

In other words, the researchers were able to get atoms to behave as if they had a negative mass, rather than creating one. Nevertheless, their experiment demonstrates the level of control researchers now have when conducting quantum experiments, and also serves to clarify how negative mass behaves in other systems. Basically, physicists can use the results of these kinds of experiments to probe the mysteries of the Universe where experimentation is impossible.

These include what goes on inside neutron stars or what transpires beneath the veil of an event horizon. Perhaps they could even shed some light on questions relating to dark energy.

Further Reading: Physical Review Letters, WSU