Scientists at the Large Hadron Collider reported today they apparently have discovered a previously unobserved phenomenon in proton-proton collisions. One of the detectors shows that the colliding particles appear to be intimately linked in a way not seen before in proton collisions. The correlations were observed between particles produced in 7 TeV collisions. “The new feature has appeared in our analysis around the middle of July,” physicist Guido Tonelli told fellow CERN scientists at a seminar to present the findings from the collider’s CMS (Compact Muon Solenoid) detector.
The scientists said the effect is subtle and they have performed several detailed crosschecks and studies to ensure that it is real. It bears some similarity to effects seen in the collisions of nuclei at the RHIC facility located at the US Brookhaven National Laboratory, which have been interpreted as being possibly due to the creation of hot dense matter formed in the collisions.
CMS studies the collisions by measuring angular correlations between the particles as they fly away from the point of impact.
The scientists stressed that there are several potential explanations to be considered, and they presented their news to the physics community at CERN today in hopes of “fostering a broader discussion on the subject.”
“Now we need more data to analyze fully what’s going on, and to take our first steps into the vast landscape of new physics we hope the LHC will open up,” said Tonelli.
Proton running at the Large Hadron Collider is scheduled to continue until the end of October, during which time CMS will accumulate much more data to analyze. After that, and for the remainder of 2010, the LHC will collide lead nuclei.
In order for astronomers to explore the outer reaches of our universe, they rely upon the assumption that the physical constants we observe in the lab on Earth hold the same values everywhere in the universe. This assumption seems to hold up extremely well. If the universe’s constants were grossly different, stars would fail to shine and galaxies would fail to coalesce. Yet as far as we look in our universe, the effects that depend on these constants being constant still seem to happen. But new research has revealed that one of these constants, known as the fine structure constant, may vary ever so slightly in different portions of the universe.
Of all physical constants, the fine structure constant seems like an odd one to be probing with astronomy. It appears in many equations involving some of the smallest scales in the universe. In particular, it is used frequently in quantum physics and is part of the quantum derivation of the structure of the hydrogen atom. This quantum model determines the allowed energy levels of electrons in the atoms. Change this constant and the orbitals shift as well.
Since the allowed energy levels determine what wavelengths of light such an atom can emit, a careful analysis of the positioning of these spectral lines in distant galaxies would reveal variations in the constant that helped control them. Using the Very Large Telescope (VLT) and the Keck Observatory, a team from the University of New South Wales has analyzed the spectra of 300 galaxies and found the subtle changes that should exist if this constant were less than constant.
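To get a feel for the sensitivity involved, here is a rough sketch (not the team’s method) using the non-relativistic Bohr formula, in which hydrogen energy levels scale as alpha squared. The published analyses actually compare relativistic corrections in metal-ion lines (the “many-multiplet” method), but the scaling idea is the same: nudge alpha and every transition wavelength shifts.

```python
# Back-of-the-envelope sketch (not the survey's actual method): in the
# non-relativistic Bohr model, hydrogen energy levels scale as alpha^2,
#   E_n = -(1/2) * alpha^2 * m_e * c^2 / n^2,
# so a tiny change in alpha shifts every transition wavelength.

alpha = 7.2973525693e-3        # fine structure constant
ME_C2_EV = 510_998.95          # electron rest energy in eV
HC_EV_NM = 1239.841984         # h*c in eV*nm

def lyman_alpha_wavelength_nm(a):
    """Wavelength of the hydrogen n=2 -> n=1 transition for a given alpha."""
    E1 = -0.5 * a**2 * ME_C2_EV / 1**2
    E2 = -0.5 * a**2 * ME_C2_EV / 2**2
    return HC_EV_NM / (E2 - E1)

lam0 = lyman_alpha_wavelength_nm(alpha)
lam1 = lyman_alpha_wavelength_nm(alpha * (1 + 1e-5))  # alpha larger by 1 part in 100,000

print(f"Lyman-alpha at nominal alpha          : {lam0:.4f} nm")
print(f"Lyman-alpha with d(alpha)/alpha = 1e-5: {lam1:.4f} nm")
print(f"Fractional wavelength shift           : {(lam1 - lam0) / lam0:.2e}")  # ~ -2e-5
```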
Because the two sets of telescopes point in different directions (Keck in the Northern Hemisphere and the VLT in the Southern), the researchers were able to notice that the variation seems to have a preferred direction. As Julian King, one of the paper’s authors, explained, “Looking to the north with Keck we see, on average, a smaller alpha in distant galaxies, but when looking south with the VLT we see a larger alpha.”
However, “it varies by only a tiny amount — about one part in 100,000 — over most of the observable universe”. As such, although the result is very intriguing, it does not demolish our understanding of the universe or make hypotheses like that of a greatly variable speed of light plausible (an argument frequently tossed around by Creationists). But, “If our results are correct, clearly we shall need new physical theories to satisfactorily describe them.”
While this finding doesn’t challenge our knowledge of the observable universe, it may have implications for regions outside of the portion of the universe we can observe. Since our viewing distance is ultimately limited by how far we can look back, and that time is limited by when the universe became transparent, we cannot observe what the universe would be like beyond that visible horizon. The team speculates that beyond it, there may be even larger changes in this constant which would have large effects on physics in such portions. They conclude the results may, “suggest a violation of the Einstein Equivalence Principle, and could infer a very large or infinite universe, within which our ‘local’ Hubble volume represents a tiny fraction, with correspondingly small variations in the physical constants.”
This would mean that, outside of our portion of the universe, the physical laws may not be suitable for life, making our little corner of the universe a sort of oasis. This could help solve the supposed “fine-tuning” problem without relying on explanations such as multiple universes.
The idea of the “Theory of Everything” is enticing – that we could somehow explain all that is. String theory has been proposed since the 1960s as a way to reconcile quantum mechanics and general relativity into such an explanation. However, the biggest criticism of String Theory is that it isn’t testable. But now, a research team led by scientists from Imperial College London unexpectedly discovered that string theory also seems to predict the behavior of entangled quantum particles. As this prediction can be tested in the laboratory, the researchers say they can now test string theory.
“If experiments prove that our predictions about quantum entanglement are correct, this will demonstrate that string theory ‘works’ to predict the behavior of entangled quantum systems,” said Professor Mike Duff, lead author of the study.
String theory was originally developed to describe the fundamental particles and forces that make up our universe, and has been a favorite contender among physicists to allow us to reconcile what we know about the incredibly small from particle physics with our understanding of the very large from our studies of cosmology. Using the theory to predict how entangled quantum particles behave provides the first opportunity to test string theory by experiment.
But – at least for now – the scientists won’t be able to confirm that String Theory is actually the way to explain all that is, only whether it actually works.
“This will not be proof that string theory is the right ‘theory of everything’ that is being sought by cosmologists and particle physicists,” said Duff. “However, it will be very important to theoreticians because it will demonstrate whether or not string theory works, even if its application is in an unexpected and unrelated area of physics.”
String theory is a theory of gravity, an extension of General Relativity, and the classical interpretation of strings and branes is that they are quantum mechanical vibrating, extended charged black holes. The theory hypothesizes that the electrons and quarks within an atom are not 0-dimensional objects, but 1-dimensional strings. These strings can move and vibrate, giving the observed particles their flavor, charge, mass and spin. The strings make closed loops unless they encounter surfaces, called D-branes, where they can open up into 1-dimensional lines. The endpoints of the string cannot break off the D-brane, but they can slide around on it.
Duff said he was sitting in a conference in Tasmania where a colleague was presenting the mathematical formulae that describe quantum entanglement when he realized something. “I suddenly recognized his formulae as similar to some I had developed a few years earlier while using string theory to describe black holes. When I returned to the UK I checked my notebooks and confirmed that the maths from these very different areas was indeed identical.”
Duff and his colleagues realized that the mathematical description of the pattern of entanglement between three qubits resembles the mathematical description, in string theory, of a particular class of black holes. Thus, by combining their knowledge of two of the strangest phenomena in the universe, black holes and quantum entanglement, they realized they could use string theory to produce a prediction that could be tested. Using the string theory mathematics that describes black holes, they predicted the pattern of entanglement that will occur when four qubits are entangled with one another. (The answer to this problem has not been calculated before.) Although it is technically difficult to do, the pattern of entanglement between four entangled qubits could be measured in the laboratory and the accuracy of this prediction tested.
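The three-qubit invariant at the heart of the correspondence is Cayley’s hyperdeterminant (four times its absolute value is the “3-tangle”); in the black-hole dictionary Duff describes, its square root plays the role of an entropy. The sketch below is only an illustration of that invariant, not the authors’ code: it evaluates the hyperdeterminant for the GHZ and W states, the two inequivalent classes of genuine three-qubit entanglement.

```python
import numpy as np

def cayley_hyperdeterminant(a):
    """Cayley's hyperdeterminant of a 2x2x2 amplitude array a[i,j,k].

    The three-qubit 'residual tangle' is tau = 4*|Det|; in the string-theory
    dictionary, sqrt(|Det|) plays the role of a black-hole entropy.
    Illustrative sketch only (real amplitudes assumed here).
    """
    return (a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
            + a[0,1,0]**2 * a[1,0,1]**2 + a[1,0,0]**2 * a[0,1,1]**2
            - 2 * (a[0,0,0]*a[0,0,1]*a[1,1,0]*a[1,1,1]
                   + a[0,0,0]*a[0,1,0]*a[1,0,1]*a[1,1,1]
                   + a[0,0,0]*a[0,1,1]*a[1,0,0]*a[1,1,1]
                   + a[0,0,1]*a[0,1,0]*a[1,0,1]*a[1,1,0]
                   + a[0,0,1]*a[0,1,1]*a[1,1,0]*a[1,0,0]
                   + a[0,1,0]*a[0,1,1]*a[1,0,1]*a[1,0,0])
            + 4 * (a[0,0,0]*a[0,1,1]*a[1,0,1]*a[1,1,0]
                   + a[0,0,1]*a[0,1,0]*a[1,0,0]*a[1,1,1]))

# GHZ state (|000> + |111>)/sqrt(2): genuinely three-way entangled, tau = 1
ghz = np.zeros((2, 2, 2)); ghz[0,0,0] = ghz[1,1,1] = 1/np.sqrt(2)

# W state (|001> + |010> + |100>)/sqrt(3): only pairwise entangled, tau = 0
w = np.zeros((2, 2, 2)); w[0,0,1] = w[0,1,0] = w[1,0,0] = 1/np.sqrt(3)

for name, state in [("GHZ", ghz), ("W", w)]:
    print(name, "3-tangle =", round(4 * abs(cayley_hyperdeterminant(state)), 6))
```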
The discovery that string theory seems to make predictions about quantum entanglement is completely unexpected, but because quantum entanglement can be measured in the lab, it does mean that there is a way – finally – for researchers to test predictions based on string theory.
But, Duff said, there is no obvious connection to explain why a theory that is being developed to describe the fundamental workings of our universe is useful for predicting the behavior of entangled quantum systems. “This may be telling us something very deep about the world we live in, or it may be no more than a quirky coincidence”, said Duff. “Either way, it’s useful.”
Atoms are made of protons, neutrons and electrons. If you cram them together and heat them up you get plasma where the electrons are only loosely associated with individual nuclei and you get a dynamic, light-emitting mix of positively charged ions and negatively charged electrons. If you cram that matter together even further, you drive electrons to merge with protons and you are left with a collection of neutrons – like in a neutron star. So, what if you keep cramming that collection of neutrons together into an even higher density? Well, eventually you get a black hole – but before that (at least hypothetically) you get a strange star.
The theory has it that compressing neutrons can eventually overcome the strong interaction, breaking down a neutron into its constituent quarks, giving a roughly equal mix of up, down and strange quarks – allowing these particles to be crammed even closer together in a smaller volume. By convention, this is called strange matter. It has been suggested that very massive neutron stars may have strange matter in their compressed cores.
However, some say that strange matter may be a fundamentally more stable configuration of matter than ordinary matter. So, once a star’s core becomes strange, contact between it and baryonic (i.e. protons and neutrons) matter might drive the baryonic matter to adopt the strange (but more stable) configuration. This is the sort of thinking behind fears that the Large Hadron Collider might destroy the Earth by producing strangelets, which would then trigger a Kurt Vonnegut ice-nine scenario. However, since the LHC hasn’t done any such thing, it’s reasonable to think that strange stars probably don’t form this way either.
More likely a ‘naked’ strange star, with strange matter extending from its core to its surface, might evolve naturally under its own self gravity. Once a neutron star’s core becomes strange matter, it should contract inwards leaving behind volume for an outer layer to be pulled inwards into a smaller radius and a higher density, at which point that outer layer might also become strange… and so on. Just as it seems implausible to have a star whose core is so dense that it’s essentially a black hole, but still with a star-like crust – so it may be that when a neutron star develops a strange core it inevitably becomes strange throughout.
Anyhow, if they exist at all, strange stars should have some telltale characteristics. We know that neutron stars tend to lie in the range of 1.4 to 2 solar masses – and that any star with a neutron star’s density that’s over 10 solar masses has to become a black hole. That leaves a bit of a gap – although there is evidence of stellar black holes down to only 3 solar masses, so the gap for strange stars to form may only be in that 2 to 3 solar masses range.
The likely electrodynamic properties of strange stars are also of interest (see below). It is likely that electrons will be displaced towards the surface – leaving the body of the star with a net positive charge surrounded by an atmosphere of negatively charged electrons. Presuming a degree of differential rotation between the star and its electron atmosphere, such a structure would generate a magnetic field of the magnitude that can be observed in a number of candidate stars.
Another distinct feature should be a size that is smaller than most neutron stars. One strange star candidate is RXJ1856, which appears to be a neutron star, but is only 11 km in diameter. Some astrophysicists may have muttered “hmmm… that’s strange” on hearing about it – but it remains to be confirmed that it really is.
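One way to see why an 11 km diameter raises eyebrows is to compare it with the Schwarzschild radius of a typical neutron-star mass. The quick comparison below assumes 1.4 solar masses (the text gives no mass for RXJ1856), so it is only an order-of-magnitude illustration.

```python
# How compact is an 11 km *diameter* object? Compare with the Schwarzschild
# radius R_s = 2GM/c^2 for an assumed 1.4 solar-mass star (assumption: the
# mass of RXJ1856 is not quoted above; 1.4 M_sun is just a typical value).

G = 6.674e-11        # m^3 kg^-1 s^-2
C = 2.998e8          # m/s
M_SUN = 1.989e30     # kg

def schwarzschild_radius_km(mass_solar):
    return 2 * G * (mass_solar * M_SUN) / C**2 / 1e3

r_star_km = 11.0 / 2                      # 11 km diameter -> 5.5 km radius
r_s_km = schwarzschild_radius_km(1.4)
print(f"Schwarzschild radius for 1.4 M_sun : {r_s_km:.1f} km")
print(f"Quoted stellar radius              : {r_star_km:.1f} km")
print(f"Ratio R_star / R_s                 : {r_star_km / r_s_km:.2f}")
# Ordinary neutron-star models give radii of roughly 10-12 km, i.e. about
# 2.5-3 Schwarzschild radii, so a ~5.5 km radius would be strikingly compact.
```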
Cosmologists – and not particle physicists – could be the ones who finally measure the mass of the elusive neutrino particle. A group of cosmologists have made their most accurate measurement yet of the mass of these mysterious so-called “ghost particles.” They didn’t use a giant particle detector; instead, they used data from the largest-ever survey of galaxies, the Sloan Digital Sky Survey. While previous experiments had shown that neutrinos have mass, it is so small that it has proven very hard to measure. But looking at the Sloan data on galaxies, PhD student Shaun Thomas and his advisers at University College London put the mass of a neutrino at no greater than 0.28 electron volts, which is less than a billionth of the mass of a single hydrogen atom. This is one of the most accurate measurements of the mass of a neutrino to date.
Their work is based on the principle that the huge abundance of neutrinos (there are trillions passing through you right now) has a large cumulative effect on the matter of the cosmos, which naturally forms into “clumps” of groups and clusters of galaxies. As neutrinos are extremely light they move across the universe at great speeds which has the effect of smoothing this natural “clumpiness” of matter. By analysing the distribution of galaxies across the universe (i.e. the extent of this “smoothing-out” of galaxies) scientists are able to work out the upper limits of neutrino mass.
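The size of that smoothing effect can be roughed out with two standard textbook relations (this is an illustration, not the UCL team’s actual pipeline): the relic neutrino density, Omega_nu h^2 = sum(m_nu)/93.14 eV, and the approximate suppression of small-scale clustering power, dP/P ~ -8 Omega_nu/Omega_m. The cosmological parameters below are assumed values, typical of SDSS-era analyses.

```python
# Rough illustration (not the UCL analysis) of why galaxy clustering is
# sensitive to neutrino mass. Two standard relations are used:
#   Omega_nu * h^2 = sum(m_nu) / 93.14 eV      (relic neutrino density)
#   dP/P ~ -8 * f_nu, with f_nu = Omega_nu / Omega_m
# (approximate small-scale suppression of the matter power spectrum).

h = 0.70          # Hubble parameter, H0 = 100*h km/s/Mpc (assumed)
OMEGA_M = 0.27    # total matter density parameter (assumed)

def clustering_suppression(sum_mnu_eV):
    omega_nu = sum_mnu_eV / (93.14 * h**2)   # neutrino density parameter
    f_nu = omega_nu / OMEGA_M                # neutrino fraction of the matter
    return 8 * f_nu                          # fractional suppression of P(k)

for m in (0.28, 0.1):   # the paper's upper limit, and the hoped-for future one
    print(f"sum(m_nu) = {m:.2f} eV  ->  ~{100 * clustering_suppression(m):.0f}% "
          "suppression of small-scale clustering power")
```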
A neutrino is capable of passing through a light year – about six trillion miles – of lead without hitting a single atom.
Central to this new calculation is the existence of the largest ever 3D map of galaxies, called Mega Z, which covers over 700,000 galaxies recorded by the Sloan Digital Sky Survey and allows measurements over vast stretches of the known universe.
“Of all the hypothetical candidates for the mysterious Dark Matter, so far neutrinos provide the only example of dark matter that actually exists in nature,” said Ofer Lahav, Head of UCL’s Astrophysics Group. “It is remarkable that the distribution of galaxies on huge scales can tell us about the mass of the tiny neutrinos.”
The cosmologists at UCL were able to estimate distances to galaxies using a new method that measures the colour of each galaxy. By combining this enormous galaxy map with information from the temperature fluctuations in the after-glow of the Big Bang, called the Cosmic Microwave Background radiation, they were able to put one of the smallest upper limits on the mass of the neutrino to date.
“Although neutrinos make up less than 1% of all matter they form an important part of the cosmological model,” said Dr. Shaun Thomas. “It’s fascinating that the most elusive and tiny particles can have such an effect on the Universe.”
“This is one of the most effective techniques available for measuring the neutrino masses,” said Dr. Filipe Abdalla. “This puts great hopes to finally obtain a measurement of the mass of the neutrino in years to come.”
The authors are confident that a larger survey of the Universe, such as the one they are working on called the international Dark Energy Survey, will yield an even more accurate weight for the neutrino, potentially at an upper limit of just 0.1 electron volts.
The results are published in the journal Physical Review Letters.
The strengths of the magnetic fields here on Earth, on the Sun, in interplanetary space, on stars in our galaxy (the Milky Way; some of them anyway), in the interstellar medium (ISM) in our galaxy, and in the ISM of other spiral galaxies (some of them anyway) have been measured. But there have been no measurements of the strength of magnetic fields in the space between galaxies (and between clusters of galaxies; the IGM and ICM).
Up till now.
But who cares? What scientific importance does the strength of the IGM and ICM magnetic fields have?
Estimates of these fields may provide “a clue that there was some fundamental process in the intergalactic medium that made magnetic fields,” says Ellen Zweibel, a theoretical astrophysicist at the University of Wisconsin, Madison. One “top-down” idea is that all of space was somehow left with a slight magnetic field soon after the Big Bang – around the end of inflation, Big Bang Nucleosynthesis, or decoupling of baryonic matter and radiation – and this field grew in strength as stars and galaxies amassed and amplified its intensity. Another, “bottom-up” possibility is that magnetic fields formed initially by the motion of plasma in small objects in the primordial universe, such as stars, and then propagated outward into space.
So how do you estimate the strength of a magnetic field, tens or hundreds of millions of light-years away, in regions of space a looong way from any galaxies (much less clusters of galaxies)? And how do you do this when you expect these fields to be much less than a nanoGauss (nG), perhaps as small as a femtoGauss (fG, which is a millionth of a nanoGauss)? What trick can you use??
A very neat one, one that relies on physics not directly tested in any laboratory, here on Earth, and unlikely to be so tested during the lifetime of anyone reading this today – the production of positron-electron pairs when a high energy gamma ray photon collides with an infrared or microwave one (this can’t be tested in any laboratory, today, because we can’t make gamma rays of sufficiently high energy, and even if we could, they’d collide so rarely with infrared light or microwaves we’d have to wait centuries to see such a pair produced). But blazars produce copious quantities of TeV gamma rays, and in intergalactic space microwave photons are plentiful (that’s what the cosmic microwave background – CMB – is!), and so too are far infrared ones.
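The kinematics behind that trick are easy to sketch: for a head-on collision, a gamma ray of energy E_gamma can create an electron-positron pair off a background photon of energy eps only if E_gamma times eps exceeds (m_e c^2)^2. The short, illustrative calculation below (not tied to any particular paper) shows which background photons a TeV photon can pair-produce on.

```python
# Pair-production threshold for a head-on collision,
#   E_gamma * eps >= (m_e c^2)^2,
# used here only to illustrate why TeV blazar photons are absorbed by
# infrared and (at the highest energies) microwave background photons.

ME_C2_EV = 0.511e6          # electron rest energy, eV
H_EV_S   = 4.135667e-15     # Planck constant, eV*s
C_M_S    = 2.998e8          # speed of light, m/s

def threshold_background_energy_eV(E_gamma_eV):
    """Minimum background-photon energy that can pair-produce with E_gamma."""
    return ME_C2_EV**2 / E_gamma_eV

for E_TeV in (1, 10, 100):
    eps = threshold_background_energy_eV(E_TeV * 1e12)
    wavelength_um = H_EV_S * C_M_S / eps * 1e6   # lambda = h*c/eps, in microns
    print(f"{E_TeV:>3} TeV gamma ray: background photons above ~{eps:.4f} eV "
          f"(wavelengths shorter than ~{wavelength_um:.0f} micron)")
# ~1 TeV photons see the mid/far-infrared background; only the very highest
# energy gamma rays reach down to CMB (microwave) photon energies.
```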
Having been produced, the positron and electron will interact with the CMB, local magnetic fields, other electrons and positrons, etc. (the details are rather messy, but were basically worked out some time ago), with the net result that observations of distant, bright sources of TeV gamma rays can set lower limits on the strength of the magnetic fields in the IGM and ICM through which they travel. Several recent papers report results of such observations, using the Fermi Gamma-Ray Space Telescope and the MAGIC telescope.
So how strong are these magnetic fields? The various papers give different numbers, from greater than a few tenths of a femtoGauss to greater than a few femtoGauss.
“The fact that they’ve put a lower bound on magnetic fields far out in intergalactic space, not associated with any galaxy or clusters, suggests that there really was some process that acted on very wide scales throughout the universe,” Zweibel says. And that process would have occurred in the early universe, not long after the Big Bang. “These magnetic fields could not have formed recently and would have to have formed in the primordial universe,” says Ruth Durrer, a theoretical physicist at the University of Geneva.
So, perhaps we have yet one more window into the physics of the early universe; hooray!
And now there’s a paper which seems, at last, to explain the observations; the cause is, apparently, a complex interplay of gravity, angular motion, and star formation.
It is now reasonably well-understood how supermassive black holes (SMBHs), found in the nuclei of all normal galaxies, can snack on stars, gas, and dust which comes within about a third of a light-year (magnetic fields do a great job of shedding the angular momentum of this ordinary, baryonic matter).
Also, disturbances from collisions with other galaxies and the gravitational interactions of matter within the galaxy can easily bring gas to distances of about 10 to 100 parsecs (30 to 300 light years) from a SMBH.
However, how does the SMBH snare baryonic matter that’s between a tenth of a parsec and ~10 parsecs away? Why doesn’t matter just form more-or-less stable orbits at these distances? After all, the local magnetic fields are too weak to make changes (except over very long timescales), and collisions and close encounters too rare (these certainly work over timescales of ~billions of years, as evidenced by the distributions of stars in globular clusters).
That’s where new simulations by Philip Hopkins and Eliot Quataert, both of the University of California, Berkeley, come into play. Their computer models show that at these intermediate distances, gas and stars form separate, lopsided disks that are off-center with respect to the black hole. The two disks are tilted with respect to one another, allowing the stars to exert a drag on the gas that slows its swirling motion and brings it closer to the black hole.
The new work is theoretical; however, Hopkins and Quataert note that several galaxies seem to have lopsided disks of elderly stars, off-center with respect to the SMBH. And the best-studied of these is in M31.
Hopkins and Quataert now suggest that these old, off-center disks are the fossils of the stellar disks generated by their models. In their youth, such disks helped drive gas into black holes, they say.
The new study “is interesting in that it may explain such oddball [stellar disks] by a common mechanism which has larger implications, such as fueling supermassive black holes,” says Tod Lauer of the National Optical Astronomy Observatory in Tucson. “The fun part of their work,” he adds, is that it unifies “the very large-scale black hole energetics and fueling with the small scale.” Off-center stellar disks are difficult to observe because they lie relatively close to the brilliant fireworks generated by supermassive black holes. But searching for such disks could become a new strategy for hunting supermassive black holes in galaxies not known to house them, Hopkins says.
Sources: ScienceNews, “The Nuclear Stellar Disk in Andromeda: A Fossil from the Era of Black Hole Growth”, Hopkins, Quataert, to be published in MNRAS (arXiv preprint), AGN Fueling: Movies.
Physicists at the CERN research center collided sub-atomic particles in the Large Hadron Collider on Tuesday at the highest energies ever achieved. “It’s a great day to be a particle physicist,” said CERN Director General Rolf Heuer. “A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends.” Already, the instruments in the LHC have recorded thousands of events, and at this writing, the LHC has had more than an hour of stable and colliding beams.
CERN announced that on March 30 they will attempt to collide beams in the Large Hadron Collider at 3.5 TeV per beam, the highest collision energy yet achieved in a particle accelerator. A live webcast will be shown of the event, and will include live footage from the control rooms for the LHC accelerator and all four LHC experiments, as well as a press conference after the first collisions are announced.
“With two beams at 3.5 TeV, we’re on the verge of launching the LHC physics program,” said CERN’s Director for Accelerators and Technology, Steve Myers. “But we’ve still got a lot of work to do before collisions. Just lining the beams up is a challenge in itself: it’s a bit like firing needles across the Atlantic and getting them to collide half way.”
The webcast will be available at a link to be announced, but the tentative schedule of events (subject to change) and more information can be found at this link.
Webcasts will also be available from the control rooms of the four LHC experiments: ALICE, ATLAS, CMS and LHCb. The webcasts will be primarily in English.
Between now and 30 March, the LHC team will be working with 3.5 TeV beams to commission the beam control systems and the systems that protect the particle detectors from stray particles. All these systems must be fully commissioned before collisions can begin.
“The LHC is not a turnkey machine,” said CERN Director General Rolf Heuer. “The machine is working well, but we’re still very much in a commissioning phase and we have to recognize that the first attempt to collide is precisely that. It may take hours or even days to get collisions.”
The last time CERN switched on a major new research machine, the Large Electron Positron collider, LEP, in 1989, it took three days from the first attempt to collide to the first recorded collisions.
The current Large Hadron Collider run began on 20 November 2009, with the first circulating beam at 0.45 TeV. Milestones were quick to follow, with twin circulating beams established by 23 November and a world record beam energy of 1.18 TeV being set on 30 November. By the time the LHC switched off for 2009 on 16 December, another record had been set with collisions recorded at 2.36 TeV and significant quantities of data recorded. Over the 2009 part of the run, each of the LHC’s four major experiments, ALICE, ATLAS, CMS and LHCb recorded over a million particle collisions, which were distributed smoothly for analysis around the world on the LHC computing grid. The first physics papers were soon to follow. After a short technical stop, beams were again circulating on 28 February 2010, and the first acceleration to 3.5 TeV was on 19 March.
Once 7 TeV collisions have been established, the plan is to run continuously for a period of 18-24 months, with a short technical stop at the end of 2010. This will bring enough data across all the potential discovery areas to firmly establish the LHC as the world’s foremost facility for high-energy particle physics.
Published in 1915, Einstein’s theory of general relativity (GR) passed its first big test just a few years later, when the predicted gravitational deflection of light passing near the Sun was observed during the 1919 solar eclipse.
In 1960, GR passed its first big test in a lab, here on Earth; the Pound-Rebka experiment. And over the nine decades since its publication, GR has passed test after test after test, always with flying colors (check out this review for an excellent summary).
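Both of those classic numbers are easy to reproduce at first order (textbook formulas, not part of the new study): the deflection of starlight grazing the Sun, theta = 4GM/(c^2 R), and the fractional frequency shift measured by Pound and Rebka over their 22.5 m tower, dnu/nu ~ gh/c^2.

```python
# First-order textbook values for the two classic tests mentioned above:
#   light deflection at the solar limb:  theta = 4*G*M_sun / (c^2 * R_sun)
#   Pound-Rebka gravitational shift:     dnu/nu ~ g*h / c^2  (22.5 m tower)

import math

G = 6.674e-11          # m^3 kg^-1 s^-2
C = 2.998e8            # m/s
M_SUN = 1.989e30       # kg
R_SUN = 6.957e8        # m
g = 9.81               # m/s^2
h_tower = 22.5         # m, height of the Harvard tower used by Pound & Rebka

theta_rad = 4 * G * M_SUN / (C**2 * R_SUN)
theta_arcsec = math.degrees(theta_rad) * 3600
print(f"Solar limb deflection : {theta_arcsec:.2f} arcsec")   # ~1.75 arcsec

shift = g * h_tower / C**2
print(f"Pound-Rebka shift     : {shift:.1e}")                 # ~2.5e-15
```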
But the tests have always been within the solar system, or otherwise indirect.
Now a team led by Princeton University scientists has tested GR to see if it holds true at cosmic scales. And, after two years of analyzing astronomical data, the scientists have concluded that Einstein’s theory works as well in vast distances as in more local regions of space.
The scientists’ analysis of more than 70,000 galaxies demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out by Einstein in his famous theory. While GR has been accepted by the scientific community for over nine decades, until now no one had tested the theory so thoroughly and robustly at distances and scales that go far beyond the solar system.
Reinabelle Reyes, a Princeton graduate student in the Department of Astrophysical Sciences, along with co-authors Rachel Mandelbaum, an associate research scholar, and James Gunn, the Eugene Higgins Professor of Astronomy, outlined their assessment in the March 11 edition of Nature.
Other scientists collaborating on the paper include Tobias Baldauf, Lucas Lombriser and Robert Smith of the University of Zurich and Uros Seljak of the University of California-Berkeley.
The results are important, they said, because they shore up current theories explaining the shape and direction of the universe, including ideas about dark energy, and dispel some hints from other recent experiments that general relativity may be wrong.
“All of our ideas in astronomy are based on this really enormous extrapolation, so anything we can do to see whether this is right or not on these scales is just enormously important,” Gunn said. “It adds another brick to the foundation that underlies what we do.”
GR is one of the two core theories underlying all of contemporary astrophysics and cosmology (the other is the Standard Model of particle physics, a quantum theory); it explains everything from black holes to the Big Bang.
In recent years, several alternatives to general relativity have been proposed. These modified theories of gravity depart from general relativity on large scales to circumvent the need for dark energy, dark matter, or both. But because these theories were designed to match the predictions of general relativity about the expansion history of the universe, a factor that is central to current cosmological work, it has become crucial to know which theory is correct, or at least represents reality as best as can be approximated.
“We knew we needed to look at the large-scale structure of the universe and the growth of smaller structures composing it over time to find out,” Reyes said. The team used data from the Sloan Digital Sky Survey (SDSS), a long-term, multi-institution telescope project mapping the sky to determine the position and brightness of several hundred million galaxies and quasars.
By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the universe, and analyzing their velocities and distortion from intervening material – due to weak lensing, primarily by dark matter – the researchers have shown that Einstein’s theory explains the nearby universe better than alternative theories of gravity.
The Princeton scientists studied the effects of gravity on the SDSS galaxies and clusters of galaxies over long periods of time. They observed how this fundamental force drives galaxies to clump into larger collections of galaxies and how it shapes the expansion of the universe.
Critically, because relativity calls for the curvature of space to be equal to the curvature of time, the researchers could calculate whether light was influenced in equal amounts by both, as it should be if general relativity holds true.
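In practice, comparisons like this are often expressed through a single statistic, called E_G in the Reyes et al. paper, built from the ratio of lensing to clustering and galaxy velocities; on large scales, general relativity predicts roughly E_G = Omega_m,0 / f(z), where f is the growth rate of structure. The sketch below is only a schematic of that prediction with assumed cosmological parameters, not the paper’s pipeline.

```python
# Schematic version of the comparison (not the paper's full analysis):
# in GR, on large scales, the statistic E_G built from lensing, clustering
# and galaxy velocities reduces to roughly  E_G ~ Omega_m,0 / f(z),
# where f(z) ~ Omega_m(z)^0.55 is the linear growth rate.
# Cosmological parameters below are assumed (flat LCDM).

OMEGA_M0 = 0.27       # present-day matter density (assumed)
z = 0.32              # approximate effective redshift of the galaxy sample

def omega_m_of_z(om0, z):
    a3 = (1 + z)**3
    return om0 * a3 / (om0 * a3 + (1 - om0))   # flat LCDM, matter + Lambda

f = omega_m_of_z(OMEGA_M0, z)**0.55            # growth-rate approximation
E_G_gr = OMEGA_M0 / f
print(f"GR prediction: E_G ~ {E_G_gr:.2f} at z = {z}")
# Reyes et al. report a measured E_G = 0.39 +/- 0.06, consistent with this.
```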
“This is the first time this test was carried out at all, so it’s a proof of concept,” Mandelbaum said. “There are other astronomical surveys planned for the next few years. Now that we know this test works, we will be able to use it with better data that will be available soon to more tightly constrain the theory of gravity.”
Firming up the predictive powers of GR can help scientists better understand whether current models of the universe make sense, the scientists said.
“Any test we can do in building our confidence in applying these very beautiful theoretical things but which have not been tested on these scales is very important,” Gunn said. “It certainly helps when you are trying to do complicated things to understand fundamentals. And this is a very, very, very fundamental thing.”
“The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe,” said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. “Those alternative theories that do not require dark matter fail these tests.”
Sources: “Princeton scientists say Einstein’s theory applies beyond the solar system” (Princeton University), “Study validates general relativity on cosmic scale, existence of dark matter” (University of California Berkeley), “Confirmation of general relativity on large scales from weak lensing and galaxy velocities” (Nature, arXiv preprint)