AMS Now Attached to the Space Station, Ready to Observe the Invisible Universe

The AMS sits near the center of this graphic, which shows where the experiment is located on the truss of the ISS. Credit: NASA

The long-awaited Alpha Magnetic Spectrometer, a particle physics detector that could unlock mysteries about dark matter and other cosmic radiation, has now been installed outside the International Space Station. It is the largest and most complex scientific instrument yet on board the orbiting laboratory, and will examine ten thousand cosmic-ray hits every minute, looking for nature’s best-kept particle secrets and searching for clues to the fundamental nature of matter.

“Thank you very much for the great ride and safe delivery of AMS to the station,” said Dr. Samuel Ting, speaking via radio to the crew on orbit who installed the AMS. Ting is the AMS Principal Investigator who has worked on the project for close to 20 years. “Your support and fantastic work have taken us one step closer to realizing the science potential of AMS. With your help, for the next 20 years, AMS on the station will provide us a better understanding of the origin of the universe.”

“Thank you, Sam,” Endeavour commander Mark Kelly radioed back, “I was just looking out the window of the orbiter and AMS looks absolutely fantastic on the truss. I know you guys are really excited and you’re probably getting data and looking at it already.”

By collecting and measuring vast numbers of cosmic rays and their energies, particle physicists hope to understand more about how and where they are born, since where cosmic rays originate remains a long-standing mystery. They could be created in the magnetic fields of exploded stars, or perhaps in the hearts of active galaxies, or maybe in places as yet unseen by astronomers.

The AMS is actually AMS-02 – a prototype of the instrument, AMS-01, was launched on board the space shuttle in 1998, and showed great potential. But Ting and his collaborators from around the world knew that to make a significant contribution to particle physics, they needed a detector that could be in space for a long period of time.

AMS-02 will operate on the ISS until at least 2020, and hopefully longer, depending on the life of the space station.


The AMS will also search for antimatter within the cosmic rays, and attempt to determine whether the antimatter is formed from collisions between particles of dark matter, the mysterious substance that astronomers believe may make up about 22% of the Universe.

There is also the remote chance that AMS-02 will detect a particle of anti-helium, left over from the Big Bang itself.

“The most exciting objective of AMS is to probe the unknown; to search for phenomena which exist in nature that we have not yet imagined nor had the tools to discover,” said Ting.

For more information about the AMS, NASA has a detailed article.

Source: ESA, NASA TV

Did the Early Universe Have Just One Dimension?

Planck all-sky image. Credit: ESA, HFI and LFI consortia.


From a University of Buffalo press release:

Did the early universe have just one spatial dimension? That’s the mind-boggling concept at the heart of a theory that physicist Dejan Stojkovic from the University at Buffalo and colleagues proposed in 2010. They suggested that the early universe — which exploded from a single point and was very, very small at first — was one-dimensional (like a straight line) before expanding to include two dimensions (like a plane) and then three (like the world in which we live today).

The theory, if valid, would address important problems in particle physics.

Now, in a new paper in Physical Review Letters, Stojkovic and Loyola Marymount University physicist Jonas Mureika describe a test that could prove or disprove the “vanishing dimensions” hypothesis.

Because it takes time for light and other waves to travel to Earth, telescopes peering out into space can, essentially, look back in time as they probe the universe’s outer reaches.

Gravitational waves can’t exist in one- or two-dimensional space. So Stojkovic and Mureika have reasoned that the Laser Interferometer Space Antenna (LISA), a planned international gravitational observatory, should not detect any gravitational waves emanating from the lower-dimensional epochs of the early universe.

Stojkovic, an assistant professor of physics, says the theory of evolving dimensions represents a radical shift from the way we think about the cosmos — about how our universe came to be.

The core idea is that the dimensionality of space depends on the size of the space we’re observing, with smaller spaces associated with fewer dimensions. That means that a fourth dimension will open up — if it hasn’t already — as the universe continues to expand.

The theory also suggests that space has fewer dimensions at very high energies of the kind associated with the early, post-big bang universe.

If Stojkovic and his colleagues are right, they will be helping to address fundamental problems with the standard model of particle physics, including the following:

The incompatibility between quantum mechanics and general relativity. Quantum mechanics and general relativity are mathematical frameworks that describe the physics of the universe. Quantum mechanics is good at describing the universe at very small scales, while relativity is good at describing the universe at large scales. Currently, the two theories are considered incompatible; but if the universe, at its smallest levels, had fewer dimensions, mathematical discrepancies between the two frameworks would disappear.

Physicists have observed that the expansion of the universe is speeding up, and they don’t know why. The addition of new dimensions as the universe grows would explain this acceleration. (Stojkovic says a fourth dimension may have already opened at large, cosmological scales.)

The standard model of particle physics predicts the existence of an as yet undiscovered elementary particle called the Higgs boson. For equations in the standard model to accurately describe the observed physics of the real world, however, researchers must artificially adjust the mass of the Higgs boson for interactions between particles that take place at high energies. If space has fewer dimensions at high energies, the need for this kind of “tuning” disappears.

“What we’re proposing here is a shift in paradigm,” Stojkovic said. “Physicists have struggled with the same problems for 10, 20, 30 years, and straightforward extensions of the existing ideas are unlikely to solve them.”

“We have to take into account the possibility that something is systematically wrong with our ideas,” he continued. “We need something radical and new, and this is something radical and new.”

Because the planned deployment of LISA is still years away, it may be a long time before Stojkovic and his colleagues are able to test their ideas this way.

However, some experimental evidence already points to the possible existence of lower-dimensional space.

Specifically, scientists have observed that the main energy flux of cosmic ray particles with energies exceeding 1 teraelectronvolt — the kind of high energy associated with the very early universe — is aligned along a two-dimensional plane.

If high energies do correspond with lower-dimensional space, as the “vanishing dimensions” theory proposes, researchers working with the Large Hadron Collider particle accelerator in Europe should see planar scattering at such energies.

Stojkovic says the observation of such events would be “a very exciting, independent test of our proposed ideas.”

Sources: EurekAlert, Physical Review Letters.

The Universe in a Chocolate Creme Egg

Can chocolate cream eggs help explain the mysteries of the Universe? As part of the University of Nottingham’s Sixty Symbols science video series, the Cadbury creme egg has been featured this week, with several eggcellent videos just in time for Easter. This one discusses the cosmological constant, and the possibility of how we might be surrounded by tiny eggs from another dimension. Surprisingly, scientists can explain and demonstrate some fundamental scientific laws that govern the universe with yummy cream-filled chocolate eggs. See more egg-themed discussions at Sixty Symbols.

No Joy for Dark Matter Detector’s First 100 Days

Bottom photomultiplier tube array on the XENON 100 detector. Credit: the XENON collaboration


We’re still mostly in the dark about Dark Matter, and the highly anticipated results from the XENON100 detector have perhaps shed a tad more light on the subject – by not making a detection in the first 100 days of the experiment. Researchers from the project say they have now been able to place the most stringent limits yet on the properties of dark matter.

To look for any possible hints of Dark Matter interacting with ordinary matter, the project has been looking for WIMPs – weakly interacting massive particles – but for now, there is no new evidence for the existence of WIMPs, or of Dark Matter either.

The extremely sensitive XENON100 detector is buried beneath the Gran Sasso mountain in central Italy, shielding it from cosmic radiation so it can hopefully detect WIMPs, hypothetical particles that might be heavier than atomic nuclei and are the most popular candidate for what Dark Matter might be made of. The detector consists of 62 kg of liquid xenon contained within a heavily shielded tank. If a WIMP were to enter the detector, it should interact with the xenon nuclei to generate light and electric signals – which would be a kind of “You Have Won!” indicator.

Dark Matter is thought to make up more than 80% of all mass in the universe, but the nature of it is still unknown. Scientists believe that it is made up of exotic particles unlike the normal (baryonic) matter, which we, the Earth, Sun and stars are made of, and it is invisible so it has only been inferred from its gravitational effects.

The XENON detector ran from January to June 2010 for its first run, and in their paper on arXiv, the team revealed they found three candidate events that might be due to Dark Matter. But two of these were expected to appear anyway because of background noise, the team said, so their results are effectively negative.

Does this rule out the existence of WIMPs? Not necessarily – the team will keep working on their search. Plus, results from a preliminary analysis of 11.2 days’ worth of data, taken during the experiment’s commissioning phase in October and November 2009, already set new upper limits on the interaction rate of WIMPs – the world’s best for WIMP masses below about 80 times the mass of a proton.

And the XENON100 team was optimistic. “These new results reveal the highest sensitivity reported as yet by any dark matter experiment, while placing the strongest constraints on new physics models for particles of dark matter,” the team said in a statement.

Read the team’s paper.

More info on XENON100

Sources: EurekAlert, physicsworld

A New Way to Visualize Warped Space and Time

Two doughnut-shaped vortexes ejected by a pulsating black hole. Also shown at the center are two red and two blue vortex lines attached to the hole, which will be ejected as a third doughnut-shaped vortex in the next pulsation. Credit: The Caltech/Cornell SXS Collaboration


Trying to understand the warping of space and time is something like visualizing a scene from Alice in Wonderland, where rooms can change sizes and locations. The most-used description of the warping of space-time is how a heavy object deforms a stretched elastic sheet. But in actuality, physicists say this warping is so complicated that they really haven’t been able to understand the details of what goes on. Now, new conceptual tools that combine theory and computer simulations are providing a better way for scientists to visualize what takes place when gravity from an object or event changes the fabric of space.

Researchers at Caltech, Cornell University, and the National Institute for Theoretical Physics in South Africa developed conceptual tools they call tendex lines and vortex lines, which represent gravitational waves. The researchers say that tendex and vortex lines describe the gravitational forces caused by warped space-time and are analogous to the electric and magnetic field lines that describe electric and magnetic forces.

“Tendex lines describe the stretching force that warped space-time exerts on everything it encounters,” said David Nichols, the Caltech graduate student who came up with the term ‘tendex.’ “Tendex lines sticking out of the Moon raise the tides on the Earth’s oceans, and the stretching force of these lines would rip apart an astronaut who falls into a black hole.”

Vortex lines, on the other hand, describe the twisting of space. So, if an astronaut’s body is aligned with a vortex line, it would get wrung like a wet towel.

Two spiral-shaped vortexes (yellow) of whirling space sticking out of a black hole, and the vortex lines (red curves) that form the vortexes. Credit: The Caltech/Cornell SXS Collaboration

They tried out the tools specifically on computer simulated black hole collisions, and saw that such impacts would produce doughnut-shaped vortex lines that fly away from the merged black hole like smoke rings. The researchers also found that a bundle of vortex lines spiral out of the black hole like water from a rotating sprinkler. Depending on the angles and speeds of the collisions, the vortex and tendex lines — or gravitational waves — would behave differently.

“Though we’ve developed these tools for black-hole collisions, they can be applied wherever space-time is warped,” says Dr. Geoffrey Lovelace, a member of the team from Cornell. “For instance, I expect that people will apply vortex and tendex lines to cosmology, to black holes ripping stars apart, and to the singularities that live inside black holes. They’ll become standard tools throughout general relativity.”

The researchers say the tendex and vortex lines provide a powerful new way to understand the nature of the universe. “Using these tools, we can now make much better sense of the tremendous amount of data that’s produced in our computer simulations,” says Dr. Mark Scheel, a senior researcher at Caltech and leader of the team’s simulation work.

Their paper was published in the April 11 issue of Physical Review Letters.

Source: CalTech

Particle Physicists See Something Little That Could be Really Big

The dijet invariant mass distribution seen by Fermilab. The blue histogram represents something that is not predicted by the Standard Model. Credit: Fermilab


Physicists from Fermilab have seen a “bump” in their data that could indicate a brand new particle unlike any ever seen before. If verified, this could re-write particle physics as we know it. “Essentially, the Tevatron has seen evidence for a new particle, 150 times mass of proton, that doesn’t behave like a standard Higgs particle,” said physicist Brian Cox on Twitter. “If this stands up to scrutiny and more data (there is not yet enough data for a “discovery”), then it is RIP Standard Model.”

“It was hard for us to not go crazy when we saw the results,” said Viviana Cavaliere from the University of Illinois (UIUC), one of the 500-member team working with the CDF particle detector at Fermi National Accelerator Laboratory in Batavia, Illinois, speaking on a webcast on April 6. “But for now, we need to stay focused on what we do know.”

The result comes from the analysis by CDF (the Collider Detector at Fermilab) of billions of collisions of protons and antiprotons produced by Fermilab’s Tevatron collider. In high-energy collisions, subatomic particles can be detected that otherwise can’t be seen. Physicists try to identify the particles they see by studying the combinations of more familiar particles into which they decay, while also trying to find new particles, such as the theoretical Higgs boson predicted by the Standard Model of particle physics.

The Standard Model contains a description of the elementary particles and forces inside atoms which make up everything around us. The model has been successful at making predictions that have subsequently been verified. There are sixteen named particles in the Standard Model; the most recently discovered were the W and Z bosons in 1983, the top quark in 1995, and the tau neutrino in 2000. But most physicists agree the Standard Model is probably not the final word in particle physics.

The researchers at Fermilab were searching for collisions that produced a W boson, which weighs about 87 times as much as a proton, as well as some other particles that disintegrate into two sprays of particles called “jets,” which are produced when a collision scatters out a particle called a quark.

Instead, they saw about 250 events which indicate a new particle weighing about 150 times as much as a proton, the team said at the webcast from Fermilab and in their paper on arXiv.

The researchers estimate the statistical chances of random jets or jet pairs from other sources producing a fake signal that strong at 1 in 1300.
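For context (this conversion is an illustration, not from the paper), a fluke probability of about 1 in 1300 corresponds to roughly a 3-sigma effect, the level particle physicists usually call "evidence" rather than "discovery". A quick sketch using Python's standard library:

```python
from statistics import NormalDist

# Converting the quoted one-sided fluke probability (~1 in 1300) into
# the "sigma" significance physicists use.
p_value = 1 / 1300
sigma = NormalDist().inv_cdf(1 - p_value)
print(f"p = {p_value:.1e}  ->  about {sigma:.1f} sigma")
# Roughly 3 sigma: interesting "evidence", but well short of the 5-sigma
# threshold (p ~ 3e-7) conventionally required to claim a discovery.
```

This is why the CDF team stops short of claiming a discovery and wants more data first.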

The Standard Model does not predict anything like what was seen in the CDF experiment, and since this particle has not been seen before and appears to have some strange properties, the physicists want to verify and retest before claiming a discovery.

“If it is not a fluctuation, it is a new particle,” Cox said.

The Tevatron accelerator at Fermilab is scheduled to be shut down later this year, due to lack of funding and because of sentiments that it would be redundant to the Large Hadron Collider.

You can see more complete discussions and interpretations of the results at:

Cosmic Variance

Science News

MSNBC

Astronomy Without A Telescope – Doubly Special Relativity

The Large Hadron Collider - destined to deliver fabulous science data, but uncertain if these will include an evidence basis for quantum gravity theories. Credit: CERN.


General relativity, Einstein’s theory of gravity, gives us a useful basis for mathematically modeling the large-scale universe – while quantum theory gives us a useful basis for modeling sub-atomic particle physics and the likely small-scale, high-energy-density physics of the early universe – nanoseconds after the Big Bang – which general relativity just models as a singularity and has nothing else to say on the matter.

Quantum gravity theories may have more to say. By extending general relativity into a quantized structure for space-time, maybe we can bridge the gap between small and large scale physics. For example, there’s doubly special relativity.

With conventional special relativity, two different inertial frames of reference may measure the speed of the same object differently. So, if you are on a train and throw a tennis ball forward, you might measure it moving at 10 kilometers an hour. But someone standing on the station platform, watching your train pass by at 60 kilometers an hour, measures the speed of the ball at 60 + 10 – i.e. 70 kilometers an hour. Give or take an almost immeasurably tiny relativistic correction, you are both correct.
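Just how tiny that correction is can be checked with the relativistic velocity-addition formula, w = (u + v)/(1 + uv/c²). A minimal sketch (the function name is mine, for illustration):

```python
# Special relativity replaces the simple sum u + v with
#     w = (u + v) / (1 + u*v/c^2)
C = 299_792_458.0  # speed of light, m/s

def add_velocities(u, v):
    """Relativistic velocity addition, u and v in m/s."""
    return (u + v) / (1 + u * v / C**2)

train = 60 / 3.6   # 60 km/h in m/s
ball  = 10 / 3.6   # 10 km/h in m/s

galilean     = train + ball
relativistic = add_velocities(train, ball)
print(f"Galilean sum: {galilean} m/s")
print(f"Relativistic: {relativistic} m/s")
# The difference is of order 1e-14 m/s -- utterly unmeasurable at train
# speeds, but the correction becomes dominant as speeds approach c.
```

At everyday speeds the denominator is indistinguishable from 1, which is why Galilean addition works so well in daily life.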

However, as Einstein pointed out, do the same experiment where you shine a torch beam, rather than throw a ball, forward on the train – both you on the train and the person on the platform measure the torch beam’s speed as the speed of light – without that additional 60 kilometers an hour – and you are both correct.

It works out that, for the person on the platform, the components of speed (distance and time) are changed on the train so that distances are contracted and time is dilated (i.e. clocks run slower). And by the math of the Lorentz transformations, these effects become more obvious the faster the train goes. It also turns out that the mass of objects on the train increases as well – although, before anyone asks, the train can’t turn into a black hole even at 99.9999(etc) per cent of the speed of light.
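All of these effects scale with the Lorentz factor, γ = 1/√(1 − v²/c²). A quick illustrative sketch (variable names are mine):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    """Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2) for speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Time dilation, length contraction and the apparent mass increase all
# scale with gamma, which only departs noticeably from 1 near light speed:
for v_frac in (0.1, 0.9, 0.999999):
    print(f"v = {v_frac} c  ->  gamma = {lorentz_gamma(v_frac * C):.3f}")
# gamma is ~1.005 at 0.1c, ~2.29 at 0.9c, and grows without bound
# as v approaches c.
```

Since γ diverges but never becomes infinite at any speed below c, and nothing changes in the train's own rest frame, no amount of speed alone turns the train into a black hole.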

Now, doubly special relativity, proposes that not only is the speed of light always the same regardless of your frame of reference, but Planck units of mass and energy are also always the same. This means that relativistic effects (like mass appearing to increase on the train) do not occur at the Planck (i.e. very small) scale – although at larger scales, doubly special relativity should deliver results indistinguishable from conventional special relativity.

The Planck spacecraft - an observatory exploring the universe and named after the founder of quantum theory. Coincidence? Credit: ESA.

Doubly special relativity might also be generalized towards a theory of quantum gravity – which, when extended up from the Planck scale, should deliver results indistinguishable from general relativity.

It turns out that at the Planck scale E = m, even though at macro scales E = mc². And at the Planck scale, a Planck mass is 2.17645 × 10⁻⁸ kg – supposedly the mass of a flea’s egg – and has a Schwarzschild radius of about a Planck length – meaning that if you compressed this mass into such a tiny volume, it would become a very small black hole containing one Planck unit of energy.

To put it another way, at the Planck scale, gravity becomes a significant force in quantum physics. Although really, all we are saying is that there is one Planck unit of gravitational force between two Planck masses when separated by a Planck length – and by the way, a Planck length is the distance that light moves within one unit of Planck time!

And since one Planck unit of energy (1.22 × 10¹⁹ GeV) is considered the maximal energy of particles – it’s tempting to consider that this represents conditions expected in the Planck epoch, being the very first stage of the Big Bang.
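The Planck units quoted above all follow directly from ħ, G, and c. A quick sketch using standard CODATA values (variable names are mine; note that in the usual convention the Schwarzschild radius of a Planck mass works out to twice the Planck length):

```python
import math

# Deriving the Planck units from the constants that define them.
HBAR = 1.054_571_817e-34   # reduced Planck constant, J s
G    = 6.674_30e-11        # Newton's constant, m^3 kg^-1 s^-2
C    = 299_792_458.0       # speed of light, m/s

m_planck = math.sqrt(HBAR * C / G)      # ~2.176e-8 kg, the "flea's egg"
l_planck = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
t_planck = l_planck / C                 # ~5.4e-44 s: light crosses one
                                        # Planck length in one Planck time

# Schwarzschild radius of a Planck mass: r_s = 2Gm/c^2, which comes out
# to exactly twice the Planck length in this convention.
r_s = 2 * G * m_planck / C**2

# Planck energy in GeV (1 GeV = 1.602176634e-10 J): ~1.22e19 GeV
e_planck_gev = m_planck * C**2 / 1.602_176_634e-10

print(f"Planck mass   : {m_planck:.4e} kg")
print(f"Planck length : {l_planck:.4e} m")
print(f"r_s / l_planck: {r_s / l_planck:.3f}")
print(f"Planck energy : {e_planck_gev:.3e} GeV")
```

The last line reproduces the 1.22 × 10¹⁹ GeV figure quoted in the text.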

It all sounds terribly exciting, but this line of thinking has been criticized as being just a trick to make the math work better, by removing important information about the physical systems under consideration. You also risk undermining fundamental principles of conventional relativity since, as the paper below outlines, a Planck length can be considered an invariant constant independent of an observer’s frame of reference, while the speed of light becomes variable at very high energy densities.

Nonetheless, since even the Large Hadron Collider is not expected to deliver direct evidence about what may or may not happen at the Planck scale – for now, making the math work better does seem to be the best way forward.

Further reading: Zhang et al. Photon Gas Thermodynamics in Doubly Special Relativity.

Astronomy Without A Telescope – Unreasonable Effectiveness


Gravitational waves are apparently devilishly difficult things to model with Einstein field equations, since they are highly dynamic and non-symmetric. Traditionally, the only way to get close to predicting the likely effects of gravity waves was to estimate the required Einstein equation parameters by assuming the objects causing the gravity waves did not generate strong gravity fields themselves – and nor did they move at velocities anywhere close to the speed of light.

Trouble is, the most likely candidate objects that might generate detectable gravity waves – close binary neutron stars and merging black holes – have exactly those properties. They are highly compact, very massive bodies that often move at relativistic (i.e. close to the speed of light) velocities.

Isn’t it weird, then, that the ‘guesstimate’ approach described above actually works brilliantly in predicting the behaviors of close massive binaries and merging black holes? Hence a recent paper titled: On the unreasonable effectiveness of post-Newtonian approximation in gravitational physics.

So, firstly no-one has yet detected gravity waves. But even in 1916, Einstein considered their existence likely and demonstrated mathematically that gravitational radiation should arise when you replace a spherical mass with a rotating dumbbell of the same mass which, due to its geometry, will generate dynamic ebb and flow effects on space-time as it rotates.

To test Einstein’s theory, it’s necessary to design very sensitive detecting equipment – and to date all such attempts have failed. Further hopes now largely rest on the Laser Interferometer Space Antenna (LISA), which is not expected to launch before 2025.

The proposed Laser Interferometer Space Antenna (LISA) system would use laser interferometry to monitor the fluctuations in the relative distances between three spacecraft, arranged in an equilateral triangle with five-million-kilometer sides. Hopefully, this will be sensitive enough to detect gravity waves. Credit: NASA.

However, as well as sensitive detection equipment like LISA, you also need to calculate what sort of phenomena and what sort of data would represent definitive evidence of a gravity wave – which is where all the theory and math required to determine these expected values is vital.

Initially, theoreticians worked out a post-Newtonian (i.e. Einstein era) approximation (i.e. guesstimate) for a rotating binary system – although it was acknowledged that this approximation would only work effectively for a low mass, low velocity system – where any complicating relativistic and tidal effects, arising from the self-gravity and velocities of the binary objects themselves, could be ignored.

Then came the era of numerical relativity where the advent of supercomputers made it possible to actually model all the dynamics of close massive binaries moving at relativistic speeds, much as how supercomputers can model very dynamic weather systems on Earth.

Surprisingly, or if you like unreasonably, the calculated values from numerical relativity were almost identical to those calculated by the supposedly bodgy post-Newtonian approximation. The post-Newtonian approximation approach just isn’t supposed to work for these situations.

All the authors are left with is the possibility that gravitational redshift makes processes near very massive objects appear slower and gravitationally ‘weaker’ to an external observer than they really are. That could – kind of, sort of – explain the unreasonable effectiveness… but only kind of, sort of.

Further reading: Will, C. On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics.

Galaxy Size Matters … And This is Not a Rorschach Test

False color image of the Lockman-hole area of the sky at infrared wavelengths as imaged by the Herschel Space Observatory. Credit: ESA/SPIRE Consortium/HerMES Consortium


When it comes to forming stars, the size of a galaxy does matter, according to research out today in the online version of Nature.

But it doesn’t have to be as massive as we once thought.

Alexandre Amblard, an astrophysicist at the University of California, Irvine, and his colleagues used new data from the Herschel Space Observatory to peer into the Lockman Hole area of the sky, where extragalactic light comes from star-forming galaxies out of reach of even the world’s most powerful telescopes.

The Lockman Hole is a patch of the sky, 15 square degrees, lying roughly between the pointer stars of the Big Dipper.

Called submillimetre galaxies, the study subjects emit light at wavelengths between the radio and infrared parts of the spectrum, so studying them requires novel approaches borrowing from both radio and optical astronomy. The galaxies by themselves are too blurry to be resolved with individual far-infrared telescopes – but their average properties can be observed and analyzed, which is exactly what Amblard and his colleagues did.

The authors measured variations in the intensity of extragalactic light at far-infrared wavelengths, and derived statistics for the level of clustering of light halos. They assume that the clustering reflects the underlying distribution of dark matter, and fit the data to a halo model of galaxy formation, which connects the spatial distribution of galaxies in the Universe to that of dark matter.

Distribution of dark matter when the Universe was about 3 billion years old, obtained from a numerical simulation of galaxy formation. The left panel displays the continuous distribution of dark matter particles, showing the typical wispy structure of the cosmic web, with a network of sheets and filaments, while the right panel highlights the dark matter halos representing the most efficient cosmic sites for the formation of star-bursting galaxies with a minimum dark matter halo mass of 300 billion times that of the Sun. Credit: VIRGO Consortium/Alexandre Amblard/ESA

Amblard and his colleagues discovered something remarkable: the ‘haloes’ of dark matter that surround the Universe’s most active star-forming galaxies are each more massive than about 300 billion solar masses.

What’s even more interesting is that the new threshold for star formation is actually smaller than some previous estimates.

“I think there was one prediction that put the number around 5,000 billion times that of the sun, but that was just a prediction from a theory of galaxy formation,” said Asantha Cooray, also an astrophysicist at UC Irvine and second author on the new paper. “The general consensus was that it may be between 100 and 1,000 billion times the sun. We now have a more precise answer from this work.”

Cooray said he’s most excited “that we can look at a detailed image of the sky showing distant, star-forming galaxies and infer not only details about the stars and gas in those galaxies but also about the amount of dark matter needed to form such galaxies. Beyond inferring the presence, we still don’t know exactly what dark matter is.”

The results appear online ahead of print today on Nature’s website.

What Is A Singularity?

Artist's conception of the event horizon of a black hole. Credit: Victor de Schwanberg/Science Photo Library

Ever since scientists first discovered the existence of black holes in our universe, we have all wondered: what could possibly exist beyond the veil of that terrible void? In addition, ever since the theory of General Relativity was first proposed, scientists have been forced to wonder, what could have existed before the birth of the Universe – i.e. before the Big Bang?

Interestingly enough, these two questions have come to be resolved (after a fashion) with the theoretical existence of something known as a Gravitational Singularity – a point in space-time where the laws of physics as we know them break down. And while there remain challenges and unresolved issues about this theory, many scientists believe that beneath the veil of an event horizon, and at the beginning of the Universe, this is what existed.

Definition:

In scientific terms, a gravitational singularity (or space-time singularity) is a location where the quantities that are used to measure the gravitational field become infinite in a way that does not depend on the coordinate system. In other words, it is a point at which all physical laws are indistinguishable from one another, where space and time are no longer interrelated realities but merge indistinguishably and cease to have any independent meaning.

This artist’s impression depicts a rapidly spinning supermassive black hole surrounded by an accretion disc. Credit: ESA/Hubble, ESO, M. Kornmesser

Origin of Theory:

Singularities were first predicted as a result of Einstein’s Theory of General Relativity, which resulted in the theoretical existence of black holes. In essence, the theory predicted that any mass compressed within a certain critical radius (aka. the Schwarzschild Radius) would exert a gravitational force so intense that it would collapse to a point.

At this point, nothing would be capable of escaping its surface, including light. This is because the escape velocity at its surface would exceed the speed of light in vacuum – 299,792,458 meters per second (1,079,252,848.8 km/h; 670,616,629 mph).
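The critical radius mentioned above follows from setting the escape velocity equal to the speed of light. A minimal sketch in Python (the constant values are standard CODATA/astronomical figures, not taken from the article):

```python
# Schwarzschild radius: the radius within which a mass M must be
# compressed for its escape velocity to reach the speed of light:
#   r_s = 2 * G * M / c^2

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light in vacuum, m/s

def schwarzschild_radius(mass_kg: float) -> float:
    """Return the Schwarzschild radius in metres for a given mass in kg."""
    return 2.0 * G * mass_kg / C**2

SOLAR_MASS = 1.989e30  # kg

r_sun = schwarzschild_radius(SOLAR_MASS)
print(f"Schwarzschild radius of the Sun: {r_sun / 1000:.2f} km")  # ~2.95 km
```

In other words, the Sun would have to be squeezed into a sphere roughly 3 km in radius before light could no longer escape it.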

The mass threshold involved is related to the Chandrasekhar Limit, named after the Indian astrophysicist Subrahmanyan Chandrasekhar, who proposed it in 1930 as the maximum mass of a stable white dwarf. At present, the accepted value of this limit is believed to be 1.39 Solar Masses (i.e. 1.39 times the mass of our Sun), which works out to a whopping 2.765 × 10³⁰ kg (or 2,765 trillion trillion metric tons).
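The kilogram figure quoted above is just the solar-mass conversion; a quick check (assuming the commonly quoted solar mass of ~1.989 × 10³⁰ kg):

```python
# Chandrasekhar limit in kilograms, as quoted in the text (1.39 solar masses).
SOLAR_MASS_KG = 1.989e30          # assumed standard value, kg
CHANDRASEKHAR_LIMIT_MSUN = 1.39   # solar masses

limit_kg = CHANDRASEKHAR_LIMIT_MSUN * SOLAR_MASS_KG
print(f"Chandrasekhar limit: {limit_kg:.3e} kg")  # ~2.765e+30 kg
```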

Another aspect of modern General Relativity is that the initial state of the Universe, at the time of the Big Bang, was a singularity. Roger Penrose and Stephen Hawking both developed theorems that attempted to explain how gravitation produces singularities, work that was eventually merged together and became known as the Penrose–Hawking Singularity Theorems.

Illustration of the Big Bang Theory
The Big Bang Theory: A history of the Universe starting from a singularity and expanding ever since. Credit: grandunificationtheory.com

According to the Penrose Singularity Theorem, which Penrose proposed in 1965, a time-like singularity will occur within a black hole whenever matter satisfies certain energy conditions. At this point, the curvature of space-time within the black hole becomes infinite, producing a trapped surface where time ceases to function.

The Hawking Singularity Theorem added to this by stating that a space-like singularity can occur when matter is forcibly compressed to a point, causing the rules that govern matter to break down. Hawking traced this back in time to the Big Bang, which he claimed was a point of infinite density. However, Hawking later revised this to claim that general relativity breaks down at times prior to the Big Bang, and hence no singularity could be predicted by it.

Some more recent proposals also suggest that the Universe did not begin as a singularity. These include theories like Loop Quantum Gravity, which attempts to unify the laws of quantum physics with gravity. This theory states that, due to quantum gravity effects, there is a minimum distance beyond which gravity no longer continues to increase, or that interpenetrating particle waves mask gravitational effects that would otherwise be felt at a distance.

Types of Singularities:

The two most important types of space-time singularity are known as Curvature Singularities and Conical Singularities. Singularities can also be divided according to whether or not they are covered by an event horizon: those that are hidden behind one include the singularities inside black holes, whereas those that are not are known as Naked Singularities.

A Curvature Singularity is best exemplified by a black hole. At the center of a black hole, a huge mass is concentrated at a single point in space-time. As a result, gravity becomes infinite, space-time curves infinitely, and the laws of physics as we know them cease to function.

Conical singularities occur at a point where the limit of every generally covariant quantity is finite. In this case, space-time looks like a cone around this point, with the singularity located at the tip of the cone. An example of such a conical singularity is a cosmic string, a hypothetical one-dimensional defect that is believed to have formed during the early Universe.

And, as mentioned, there is the Naked Singularity, a type of singularity that is not hidden behind an event horizon. These were first found in 1991, when Shapiro and Teukolsky ran computer simulations of a rotating plane of dust which indicated that General Relativity might allow for "naked" singularities.

In this case, what actually transpires within a black hole (i.e. its singularity) would be visible. Such a singularity would theoretically be what existed prior to the Big Bang. The key word here is theoretical, as it remains a mystery what these objects would look like.

For the moment, singularities and what actually lies beneath the veil of a black hole remains a mystery. As time goes on, it is hoped that astronomers will be able to study black holes in greater detail. It is also hoped that in the coming decades, scientists will find a way to merge the principles of quantum mechanics with gravity, and that this will shed further light on how this mysterious force operates.

We have many interesting articles about gravitational singularities here at Universe Today. Here are 10 Interesting Facts About Black Holes, What Would A Black Hole Look Like?, Was the Big Bang Just a Black Hole?, Goodbye Big Bang, Hello Black Hole?, Who is Stephen Hawking?, and What’s on the Other Side of a Black Hole?

If you’d like more info on singularities, check out these articles from NASA and Physlink.

Astronomy Cast has some relevant episodes on the subject. Here’s Episode 6: More Evidence for the Big Bang, and Episode 18: Black Holes Big and Small and Episode 21: Black Hole Questions Answered.

Sources: