Neutrinos are one of the fundamental particles that make up the Universe. Compared to other particles, they have very little mass, no charge, and interact with other matter only via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from interference.
However, using the Spallation Neutron Source (SNS) – a research facility located at the Oak Ridge National Laboratory (ORNL) – an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. Produced as part of the COHERENT experiment, these results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.
For decades, the Standard Model of particle physics has been the predominant means of explaining what the basic building blocks of matter are and how they interact. First developed in the 1970s, the model holds that for every particle created, there is a corresponding anti-particle. As such, an enduring mystery posed by this model is why the Universe can exist at all if it should theoretically have been created from equal parts of matter and antimatter.
This seeming disparity, tied to what is known as charge-parity (CP) violation, has been the subject of experiments for many years. But so far, no demonstration of the violation has been made that can explain how so much matter can exist in the Universe without its counterpart. Thanks to new findings released by the international Tokai-to-Kamioka (T2K) collaboration, however, we may be one step closer to understanding why this disparity exists.
First observed in 1964, CP violation occurs when, under certain conditions, the laws of charge-symmetry and parity-symmetry (aka. CP-symmetry) do not hold. These laws state that the physics governing a particle should remain the same if it were interchanged with its antiparticle while its spatial coordinates were inverted. From this observation, one of the greatest cosmological mysteries emerged.
If the laws governing matter and antimatter are the same, then why is the Universe so matter-dominated? Alternately, if matter and antimatter are fundamentally different, then how does this accord with our notions of symmetry? Answering these questions is not only important to our predominant cosmological theories, it is also intrinsic to understanding how the weak interactions that govern particles work.
Established in June of 2011, the international T2K collaboration is the first experiment in the world dedicated to answering this mystery by studying neutrino and anti-neutrino oscillations. The experiment begins with high-intensity beams of muon neutrinos (or muon anti-neutrinos) being generated at the Japan Proton Accelerator Research Complex (J-PARC), which are then fired towards the Super-Kamiokande detector 295 km away.
This detector is currently one of the world’s largest and most sophisticated, dedicated to the detection and study of solar and atmospheric neutrinos. As neutrinos travel between the two facilities, they change “flavor” – going from muon neutrinos or anti-neutrinos to electron neutrinos or anti-neutrinos. In monitoring these neutrino and anti-neutrino beams, the experiment watches for different rates of oscillation.
A difference in these oscillation rates would show that there is an imbalance between particles and antiparticles, providing the first definitive evidence of CP violation in neutrinos. It would also indicate that there is physics beyond the Standard Model that scientists have yet to probe. This past April, the first data set produced by T2K was released, and it provided some telling results.
As Mark Hartz, a T2K collaborator and Project Assistant Professor at the Kavli IPMU, said in a recent press release:
“While the data sets are still too small to make a conclusive statement, we have seen a weak preference for large CP violation and we are excited to continue to collect data and make a more sensitive search for CP violation.”
These results, which were recently published in Physical Review Letters, include all data runs from January 2010 to May 2016. In total, this data comprised 7.482 × 10²⁰ protons (in neutrino mode), which yielded 32 electron neutrino and 135 muon neutrino events, and 7.471 × 10²⁰ protons (in antineutrino mode), which yielded 4 electron anti-neutrino and 66 muon anti-neutrino events.
In other words, the first batch of data has provided some evidence for CP violation, at a confidence level of 90%. But this is just the beginning, and the experiment is expected to run for another ten years before wrapping up. “If we are lucky and the CP violation effect is large, we may expect 3 sigma evidence, or about 99.7% confidence level, for CP violation by 2026,” said Hartz.
If the experiment proves successful, physicists may finally be able to answer how it is that the early Universe didn’t annihilate itself. It is also likely to help reveal aspects of the Universe that particle physicists are anxious to get into! For it is here that the answers to the deepest secrets of the Universe, like how all of its fundamental forces fit together, are likely to be found.
For some time, physicists have understood that all known phenomena in the Universe are governed by four fundamental forces: the weak nuclear force, the strong nuclear force, electromagnetism, and gravity. Whereas the first three are all part of the Standard Model of particle physics and can be explained through quantum mechanics, our understanding of gravity depends on Einstein’s Theory of Relativity.
Understanding how these four forces fit together has been the aim of theoretical physics for decades, and it has led to the development of multiple theories that attempt to reconcile them (i.e. Superstring Theory, Quantum Gravity, Grand Unified Theory, etc.). However, these efforts may be complicated (or helped) by new research that suggests there might just be a fifth force at work.
In a paper recently published in the journal Physical Review Letters, a research team from the University of California, Irvine explains how recent particle physics experiments may have yielded evidence of a new type of boson. This boson apparently does not behave as other bosons do, and may be an indication that there is yet another force of nature governing fundamental interactions.
As Jonathan Feng, a professor of physics & astronomy at UCI and one of the lead authors on the paper, said:
“If true, it’s revolutionary. For decades, we’ve known of four fundamental forces: gravitation, electromagnetism, and the strong and weak nuclear forces. If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the universe, with consequences for the unification of forces and dark matter.”
The efforts that led to this potential discovery began back in 2015, when the UCI team came across a study from a group of experimental nuclear physicists at the Hungarian Academy of Sciences Institute for Nuclear Research. At the time, these physicists were looking into a radioactive decay anomaly that hinted at the existence of a light particle roughly 30 times heavier than an electron.
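As a quick back-of-the-envelope check (not part of the Hungarian study itself, and using only the 17 MeV figure that appears in the follow-up paper’s title below together with the standard electron rest energy of about 0.511 MeV), the quoted mass does indeed work out to roughly 30 electron masses:

\[
% hypothetical particle mass m_X taken from the "17 MeV Anomaly" paper title
\frac{m_X}{m_e} \approx \frac{17\ \text{MeV}/c^2}{0.511\ \text{MeV}/c^2} \approx 33
\]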
In a paper describing their research, lead researcher Attila Krasznahorkay and his colleagues claimed that what they were observing might be the creation of “dark photons”. In short, they believed they might have at last found evidence of Dark Matter, the mysterious, invisible substance that makes up about 85% of the Universe’s mass.
This report was largely overlooked at the time, but gained widespread attention earlier this year when Prof. Feng and his research team found it and began assessing its conclusions. After studying the Hungarian team’s results and comparing them to previous experiments, they concluded that the experimental evidence did not support the existence of dark photons.
Instead, they proposed that the discovery could indicate the possible presence of a fifth fundamental force of nature. These findings were first posted to arXiv in April and followed up by a paper titled “Particle Physics Models for the 17 MeV Anomaly in Beryllium Nuclear Decays”, which was published in PRL this past Friday.
Essentially, the UCI team argues that instead of a dark photon, what the Hungarian research team might have witnessed was the creation of a previously undiscovered boson – which they have named the “protophobic X boson”. Whereas other bosons interact with electrons and protons, this hypothetical boson interacts only with electrons and neutrons, and only at an extremely limited range.
This limited interaction is believed to be the reason why the particle has remained unknown until now, and why the descriptors “protophobic” and “X” appear in its name. “There’s no other boson that we’ve observed that has this same characteristic,” said Timothy Tait, a professor of physics & astronomy at UCI and co-author of the paper. “Sometimes we also just call it the ‘X boson,’ where ‘X’ means unknown.”
If such a particle does exist, the possibilities for research breakthroughs could be endless. Feng hopes it could be unified with the three other forces that govern particle interactions (the electromagnetic, strong, and weak nuclear forces) as part of a larger, more fundamental force. Feng also speculated that this possible discovery could point to the existence of a “dark sector” of our universe, which is governed by its own matter and forces.
“It’s possible that these two sectors talk to each other and interact with one another through somewhat veiled but fundamental interactions,” he said. “This dark sector force may manifest itself as this protophobic force we’re seeing as a result of the Hungarian experiment. In a broader sense, it fits in with our original research to understand the nature of dark matter.”
If this should prove to be the case, then physicists may be one step closer to explaining the existence of dark matter (and maybe even dark energy), two of the greatest mysteries in modern astrophysics. What’s more, it could aid researchers in the search for physics beyond the Standard Model – something the researchers at CERN have been preoccupied with since the discovery of the Higgs boson in 2012.
But as Feng notes, we need to confirm the existence of this particle through further experiments before we get all excited by its implications:
“The particle is not very heavy, and laboratories have had the energies required to make it since the ’50s and ’60s. But the reason it’s been hard to find is that its interactions are very feeble. That said, because the new particle is so light, there are many experimental groups working in small labs around the world that can follow up the initial claims, now that they know where to look.”
As the recent case involving CERN – where LHC teams were forced to announce that they had not, in fact, discovered two new particles – demonstrates, it is important not to count our chickens before they hatch. As always, cautious optimism is the best approach to potential new findings.
The early 20th century was a very auspicious time for the sciences. In addition to Ernest Rutherford and Niels Bohr laying the foundations of the modern model of the atom, it was also a period of breakthroughs in the field of quantum mechanics. Thanks to ongoing studies of the behavior of electrons, scientists began to propose theories whereby these elementary particles behaved in ways that defied classical, Newtonian physics.
One such example is the Electron Cloud Model proposed by Erwin Schrödinger. Under this model, electrons were no longer depicted as particles moving around a central nucleus in fixed orbits. Instead, Schrödinger proposed a model whereby scientists could only make educated guesses as to the positions of electrons. Hence, their locations could only be described as part of a ‘cloud’ around the nucleus where the electrons are likely to be found.
Atomic Physics To The 20th Century:
The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.
It was not until the 19th century that atomic theory became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory.
This theory expanded on the laws of conservation of mass and definite proportions and came down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in a chemical reaction, only regrouped.
Discovery Of The Electron:
By the late 19th century, scientists also began to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, this would change drastically, thanks to research conducted by scientists like Sir Joseph John Thomson.
Through a series of experiments using cathode ray tubes (known as Crookes tubes), Thomson observed that cathode rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles roughly 1,000 times smaller and about 1,800 times lighter than the hydrogen atom.
This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged “corpuscles” were distributed in a uniform sea of positive charge – known as the Plum Pudding Model.
These corpuscles would later be named “electrons”, after the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. The Plum Pudding Model took its name from the English dessert consisting of cake studded with raisins. The concept was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.
Development Of The Standard Model:
Subsequent experiments revealed a number of scientific problems with the Plum Pudding model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil – aka. the “gold foil experiment.”
In this experiment, Geiger and Marsden measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.
Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than Thomson’s model allowed for. Since alpha particles are just helium nuclei (which are positively charged), this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that those particles that were not deflected passed through unimpeded meant that these positive regions were separated by vast gulfs of empty space.
By 1911, physicist Ernest Rutherford had interpreted the Geiger-Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model in which the atom consisted of mostly empty space, with all its positive charge concentrated at its center in a very tiny volume surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.
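To see why a concentrated central charge accounts for the rare large-angle deflections, it helps to look at the scattering law Rutherford derived for Coulomb repulsion from a point-like nucleus. The following is the standard textbook form (not quoted in the article), where Z₁ and Z₂ are the charges of the alpha particle and the nucleus, E is the alpha particle’s kinetic energy, and θ is the deflection angle:

\[
% Rutherford differential cross-section for Coulomb scattering (SI units)
\frac{d\sigma}{d\Omega} = \left(\frac{Z_1 Z_2 e^2}{16\pi\varepsilon_0 E}\right)^2 \frac{1}{\sin^4(\theta/2)}
\]

The steep 1/sin⁴(θ/2) dependence means the vast majority of alpha particles are barely deflected, while the tiny fraction that pass very close to a nucleus can be thrown back through large angles – exactly the pattern Geiger and Marsden observed.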
Subsequent work by Antonius van den Broek and Niels Bohr refined the model further. While van den Broek suggested that the atomic number of an element corresponds to its nuclear charge, Bohr proposed a Solar-System-like model of the atom, in which a nucleus containing the atomic number of positive charges is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).
The Electron Cloud Model:
During the 1920s, Austrian physicist Erwin Schrödinger became fascinated by the theories of Max Planck, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and other physicists. During this time, he also became involved in the fields of atomic theory and spectra, researching at the University of Zurich and then the Friedrich Wilhelm University in Berlin (where he succeeded Planck in 1927).
In 1926, Schrödinger tackled the issue of wave functions and electrons in a series of papers. In addition to describing what would come to be known as the Schrödinger equation – a partial differential equation that describes how the quantum state of a system changes with time – he also used mathematical equations to describe the likelihood of finding an electron in a certain position.
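In its modern textbook form (not spelled out in the article), the time-dependent Schrödinger equation relates the rate of change of a system’s wave function Ψ to its total-energy operator, the Hamiltonian Ĥ, with ħ the reduced Planck constant:

\[
% time-dependent Schrödinger equation
i\hbar\,\frac{\partial}{\partial t}\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t)
\]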
This became the basis of what would come to be known as the Electron Cloud (or quantum mechanical) Model. Based on quantum theory, which holds that all matter has properties associated with a wave function, the Electron Cloud Model differs from the Bohr Model in that it does not define the exact path of an electron.
Instead, it predicts the likely location of the electron using a probability function. This function describes a cloud-like region where the electron is likely to be found, hence the name. Where the cloud is most dense, the probability of finding the electron is greatest; where the electron is less likely to be, the cloud is less dense.
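Concretely, the ‘cloud’ is the probability density obtained by squaring the magnitude of the wave function. As a standard illustration (not taken from the article), the ground state of hydrogen gives a density that is highest at the nucleus and falls off exponentially with distance, where a₀ ≈ 0.053 nm is the Bohr radius:

\[
% probability density in general, and for the hydrogen 1s ground state
P(\mathbf{r}) = |\psi(\mathbf{r})|^2, \qquad |\psi_{1s}(r)|^2 = \frac{1}{\pi a_0^3}\,e^{-2r/a_0}
\]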
These dense regions are known as “electron orbitals”, since they are the most likely locations where an orbiting electron will be found. Extending this “cloud” model to three-dimensional space, we see a barbell- or flower-shaped atom (as in the image at the top). Here, the branching-out regions are the ones where we are most likely to find the electrons.
Thanks to Schrödinger’s work, scientists began to understand that in the realm of quantum mechanics, it is impossible to know the exact position and momentum of an electron at the same time. Regardless of what the observer initially knows about a particle, they can only predict its subsequent location or momentum in terms of probabilities.
At no given time will they be able to pin down both at once. In fact, the more they know about the momentum of a particle, the less they will know about its location, and vice versa. This is what is known today as the “Uncertainty Principle”.
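Quantitatively, this trade-off is captured by Heisenberg’s relation between the uncertainty in position, Δx, and the uncertainty in momentum, Δp:

\[
% Heisenberg uncertainty relation
\Delta x\,\Delta p \ge \frac{\hbar}{2}
\]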
Note that the orbitals described above are those formed by a hydrogen atom (i.e. an atom with just one electron). When dealing with atoms that have more electrons, the electron orbital regions spread out evenly into a spherical, fuzzy ball. This is where the term ‘electron cloud’ is most appropriate.
This contribution was universally recognized as one of the most important of the 20th century, one that triggered a revolution in physics, quantum mechanics, and indeed all the sciences. Thenceforth, scientists were no longer working in a universe characterized by absolutes of time and space, but with quantum uncertainties and the relativity of space-time!
The standard model of cosmology tells us that only 4.9% of the Universe is composed of ordinary matter (i.e. that which we can see), while the remainder consists of 26.8% dark matter and 68.3% dark energy. As the names would suggest, we cannot see them, so their existence has had to be inferred from theoretical models, observations of the large-scale structure of the Universe, and their apparent gravitational effects on visible matter.
Since it was first proposed, there has been no shortage of suggestions as to what Dark Matter particles might look like. Not long ago, many scientists proposed that Dark Matter consists of Weakly-Interacting Massive Particles (WIMPs), which are about 100 times the mass of a proton but interact like neutrinos. However, all attempts to find WIMPs using collider experiments have come up empty. As such, scientists have lately been exploring the idea that dark matter may be composed of something else entirely.
The world’s most powerful particle collider is waking up from a well-earned rest. After roughly two years of heavy maintenance, scientists have nearly doubled the power of the Large Hadron Collider (LHC) in preparation for its next run. Now, it’s being cooled to just 1.9 degrees above absolute zero.
“We have unfinished business with understanding the universe,” said Tara Shears from the University of Liverpool in a news release. Shears and other LHC physicists will work to better understand the Higgs Boson and hopefully unravel some of the secrets of supersymmetry and dark matter.
On February 11, 2013, the LHC shut down for roughly two years. The break, known as LS1 (for “Long Shutdown 1”), was needed to correct several flaws in the original design of the collider.
The LHC’s first run got off to a rough start in 2008. Shortly after it was fired up, a faulty electrical connection triggered an explosion, damaging an entire sector (one-eighth) of the accelerator. To protect the accelerator from further disaster, scientists decided to run it at half power until all 10,000 copper connections could be repaired.
So over the last two years, scientists have worked around the clock to rework every single connection in the accelerator.
Now that this step (along with many others) is complete, the collider will operate at almost double its previous power. This was tested early last week, when scientists powered up the magnets of one sector to the level needed to reach the higher energies expected in its second run.
With such a powerful new tool, scientists will look for deviations from their initial detection of the Higgs boson, potentially revealing a deeper level of physics that goes well beyond the Standard Model of particle physics.
Many theorists have turned to supersymmetry — the idea that for every known fundamental particle there exists a “supersymmetric” partner particle. If true, the enhanced LHC could be powerful enough to create supersymmetric particles themselves or prove their existence in subtler ways.
“The higher energy and more frequent proton collisions in Run 2 will allow us to investigate the Higgs particle in much more detail,” said Victoria Martin from Edinburgh University. “Higher energy may also allow the mysterious ‘dark matter’ observed in galaxies to be made and studied in the lab for the first time.”
It’s possible that the Higgs could interact with — or even decay into — dark matter particles. If the latter occurs, then the dark matter particles would fly out of the LHC without ever being detected. But their absence would be evident.
So stay tuned, because these issues might be resolved in the spring of 2015, when the particle accelerator roars back to life.