What is Entropy?

After some time, this cold glass will reach thermal equilibrium

Perhaps there’s no better way to understand entropy than to grasp the second law of thermodynamics, and vice versa. This law states that the entropy of an isolated system that is not in equilibrium will increase as time progresses until equilibrium is finally achieved.

Let’s elaborate a little on this equilibrium idea. Note that in the following examples, we’ll assume each system is isolated.

First example. Imagine putting a hot body and a cold body side by side. What happens after some time? That’s right. They both end up at the same temperature, one that is lower than the original temperature of the hotter one and higher than the original temperature of the colder one.
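The final temperature is just an energy balance: heat lost by the hot body equals heat gained by the cold one. A quick sketch (with made-up masses and the specific heat of water as assumed inputs) shows the result always lands between the two starting temperatures:

```python
# Final temperature of two bodies brought into thermal contact,
# assuming an isolated system and no phase changes. The answer is a
# weighted average of the temperatures, weighted by heat capacity m*c.

def equilibrium_temp(m1, c1, t1, m2, c2, t2):
    """Equilibrium temperature of two bodies exchanging heat."""
    return (m1 * c1 * t1 + m2 * c2 * t2) / (m1 * c1 + m2 * c2)

# Example: 1 kg of hot water (80 C) in contact with 1 kg of cold
# water (20 C); specific heat of water is about 4186 J/(kg*K).
t_eq = equilibrium_temp(1.0, 4186, 80.0, 1.0, 4186, 20.0)
print(t_eq)  # 50.0 -- halfway, since the heat capacities are equal
```

With unequal masses or materials, the equilibrium point shifts toward the body with the larger heat capacity, but it always sits between the two starting temperatures.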

Second example. Ever heard of a low pressure area? It’s what weather reporters call a particular region that’s characterized by strong winds and perhaps some rain. This happens because fluids flow from regions of high pressure to regions of low pressure. Thus, when the fluid, air in this case, comes rushing in, it does so in the form of strong winds. This goes on until the pressures in the adjacent regions even out.

In both cases, the physical quantities that were initially uneven between the two bodies or regions even out in the end, i.e., when equilibrium is achieved. Entropy is the measure of the extent of this evening-out process.

During the process of attaining equilibrium, it is possible to tap into the system to perform work, as in a heat engine. Notice, however, that work can only be done for as long as there is a difference in temperature. Without it, like when maximum entropy has already been achieved, there is no way that work can be performed.
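This is the essence of the Carnot limit, a standard thermodynamics result (not specific to this article): the maximum fraction of heat convertible to work depends only on the two reservoir temperatures, and it drops to zero when they match. A minimal sketch:

```python
# Carnot efficiency: the maximum fraction of heat that an ideal heat
# engine can convert to work between a hot and a cold reservoir,
# efficiency = 1 - T_cold / T_hot (temperatures in kelvin).
# When the temperatures even out, the efficiency is zero: at maximum
# entropy, no work can be extracted.

def carnot_efficiency(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

print(carnot_efficiency(500.0, 300.0))  # 0.4 -- 40% at best
print(carnot_efficiency(300.0, 300.0))  # 0.0 -- equilibrium, no work
```

Real engines do worse than this bound, but the qualitative point stands: work extraction requires a temperature difference, and equilibrium ends it.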

Since the concept of entropy applies to all isolated systems, it has been studied not only in physics but also in information theory, mathematics, as well as other branches of science and applied science.

Because the accepted view is that the universe is finite, it can very well be considered a closed system. As such, it should also be governed by the second law of thermodynamics. Thus, like all isolated systems, the entropy of the universe is expected to be increasing.

So what? Well, just like all other isolated systems, the universe is therefore also expected to end up as a useless heap in equilibrium, a.k.a. a heat death, from which energy can no longer be extracted. To give you some relief, though, not everyone involved in the study of cosmology totally agrees with entropy’s so-called role in the grand scheme of things.

You can read more about entropy here in Universe Today. Want to know why time might flow in one direction? Have you ever thought about the time before the Big Bang? The entire entropy concept plays an important role in understanding them.

There’s more about entropy at NASA and Physics World too.

There are also a couple of episodes at Astronomy Cast that you might want to check out.

Source: Hyperphysics

What If There Is Only One Universe?

When it comes to universes, perhaps one is enough after all.

Many theories in physics and cosmology require the existence of alternate, or parallel, universes.  But Dr. Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, explains the flaws of theories that suggest our universe is just one of many, and which also perpetuate the notion that time does not exist.  Smolin, author of the bestselling science book ‘The Trouble with Physics’ and a founding member of the Perimeter Institute, explains his views in the June issue of Physics World.

Smolin explains how theories describing a myriad of possible universes, or a “multiverse”, with many dimensions and particles and forces have become more popular in the last few years. However, through his work with the Brazilian philosopher Roberto Mangabeira Unger, Smolin believes that multiverse theories, which imply that time is not a fundamental concept, are “profoundly mistaken”.

Smolin says a timeless multiverse means our laws of physics can’t be determined from experiment.  And he explains the unclear connection between fundamental laws, which are unique and applicable universally, and effective laws, which hold based on what we can actually observe.

Smolin suggests new principles that rethink the notion of physical law to apply to a single universe.  These principles say there is only one universe; that all that is real is real in a moment, as part of a succession of moments; and that everything real in each moment is a process of change leading to future moments. As he explains, “If there is just one universe, there is no reason for a separation into laws and initial conditions, as we want a law to explain just one history of one universe.”

He hopes these principles will bring a fresh adventure in science.

If we accept there is only one universe and that time is a fundamental property of nature, then this opens up the possibility that the laws of physics evolve with time. As Smolin writes, “The notion of transcending our time-bound experiences in order to discover truths that hold timelessly is an unrealizable fantasy. When science succeeds, we do nothing of the sort; what we physicists really do is discover laws that hold in the universe we experience within time. This, I would claim, should be enough; anything beyond that is more a religious urge for transcendence than science.”

Source: Institute of Physics

Is Everything Made of Mini Black Holes?

In 1971 physicist Stephen Hawking suggested that there might be “mini” black holes all around us that were created by the Big Bang. The violence of the rapid expansion following the beginning of the Universe could have squeezed concentrations of matter to form minuscule black holes, so small they can’t even be seen with a regular microscope. But what if these mini black holes were everywhere, and in fact, what if they make up the fabric of the universe? A new paper from two researchers in California proposes this idea.

Black holes are regions of space where gravity is so strong that not even light can escape, and are usually thought of as large areas of space, such as the supermassive black holes at the center of galaxies. No observational evidence of mini-black holes exists but, in principle, they could be present throughout the Universe.

Since black holes have mass, they also have gravity. But for mini black holes, the gravity would be weak. However, many physicists have assumed that even on the tiniest scale, the Planck scale, gravity regains its strength.
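To get a feel for the scales involved, here is a rough back-of-the-envelope sketch (standard textbook formulas, not the paper’s own calculation) of how small a Planck-mass black hole would be:

```python
# Size of a Planck-mass black hole: the Schwarzschild radius is
# r_s = 2*G*M/c**2, and the Planck mass is sqrt(hbar*c/G).

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s

m_planck = (hbar * c / G) ** 0.5     # ~2.2e-8 kg (about a dust grain)
r_s = 2 * G * m_planck / c ** 2      # ~3.2e-35 m

print(f"Planck mass: {m_planck:.2e} kg")
print(f"Schwarzschild radius: {r_s:.2e} m")
```

A radius of order 10^-35 m is some twenty orders of magnitude below anything a microscope, or even a particle accelerator, can resolve directly, which is why such objects could in principle be everywhere yet unseen.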

Experiments at the Large Hadron Collider are aimed at detecting mini black holes, but suffer from not knowing exactly how a reduced-Planck-mass black hole would behave, say Donald Coyne from UC Santa Cruz (now deceased) and D. C. Cheng from the Almaden Research Center near San Jose.

String theory also proposes that gravity plays a stronger role in higher dimensional space, but it is only in our four dimensional space that gravity appears weak.

Since these dimensions become important only on the Planck scale, it’s at that level that gravity re-asserts itself. And if that’s the case, then mini-black holes become a possibility, say the two researchers.

They looked at what properties black holes might have at such a small scale, and determined they could be quite varied.

Black holes lose energy and shrink in size as they do so, eventually vanishing, or evaporating. But this is a very slow process, and only the smallest black holes will have had time to significantly evaporate over the entire 14-billion-year history of the universe.
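The standard estimate for that evaporation timescale (a textbook result, not specific to this paper) scales as the cube of the mass, which is why only very light primordial black holes could have finished evaporating by now. A rough sketch:

```python
# Hawking evaporation time, t ~ 5120 * pi * G**2 * M**3 / (hbar * c**4).
# Because t grows as M**3, a stellar-mass black hole outlives the
# universe by dozens of orders of magnitude, while a small primordial
# one can evaporate within the universe's ~14-billion-year history.

import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J*s
YEAR = 3.156e7     # seconds per year

def evaporation_time_years(mass_kg):
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)
    return t_seconds / YEAR

# A ~2e11 kg primordial black hole lasts roughly the age of the
# universe; a solar-mass black hole lasts ~1e67 years.
print(f"{evaporation_time_years(2e11):.1e}")
print(f"{evaporation_time_years(1.989e30):.1e}")
```

So "the smallest black holes" here means masses comparable to a mountain, around 10^11 kg, anything heavier is still essentially pristine today.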

The quantization of space on this level means that mini-black holes could turn up at all kinds of energy levels. They predict the existence of huge numbers of black hole particles at different energy levels. And these black holes might be so common that perhaps “All particles may be varying forms of stabilized black holes.”

“At first glance the scenario … seems bizarre, but it is not,” Coyne and Cheng write. “This is exactly what would be expected if an evaporating black hole leaves a remnant consistent with quantum mechanics… This would put a whole new light on the process of evaporation of large black holes, which might then appear no different in principle from the correlated decays of elementary particles.”

They say their research needs more experimentation. This may come from the LHC, which could begin to probe the energies at which these kinds of black holes would be produced.

Original paper.

Source: Technology Review

Is a Nearby Object in Space Beaming Cosmic Rays at Earth?

Fermi Telescope. Credit: NASA

Data from several different space- and ground-based observatories imply the presence of a nearby object that is beaming cosmic rays our way. Scientists with the Fermi Space Telescope say an unknown pulsar may be close by, sending electrons and positrons towards Earth. Another, more exotic explanation is that the particles could come from the annihilation of dark matter. But whatever it is, the source is relatively close, surely in our galaxy. “If these particles were emitted far away, they’d have lost a lot of their energy by the time they reached us,” said Luca Baldini, a Fermi collaborator.

Comparing data from the Fermi space telescope with results from the PAMELA spacecraft and the High Energy Stereoscopic System (H.E.S.S.) ground-based telescope, the three observatories have found significantly more particles with energies greater than 100 billion electron volts (100 GeV) than expected from previous experiments and traditional models.

Fermi is primarily a gamma ray detector, but its Large Area Telescope (LAT) is also a tool for investigating the high-energy electrons in cosmic rays.

Video of the LAT detecting high energy particles.

Cosmic rays are hyperfast electrons, positrons, and atomic nuclei moving at nearly the speed of light. Unlike gamma rays, which travel from their sources in straight lines, cosmic rays wend their way around the galaxy. They can ricochet off of galactic gas atoms or become whipped up and redirected by magnetic fields. These events randomize the particle paths and make it difficult to tell where they originated. But determining cosmic-ray sources is one of Fermi’s key goals.

Using the LAT, which is sensitive to electrons and their antimatter counterparts, positrons, the telescope looked at the energies of 4.5 million cosmic rays that struck the detector between Aug. 4, 2008, and Jan. 31, 2009, and found more of the high-energy variety, those with more than 1 billion electron volts (1 GeV), than expected.

A spokesman from the Goddard Space Flight Center said the exact number of how many more is not currently available, due to peculiarities of the data.

But results from Fermi also refute other recent findings from a balloon-borne experiment. The Advanced Thin Ionization Calorimeter (ATIC) captured evidence for a dramatic spike in the number of cosmic rays at energies around 500 GeV from its high atmospheric location over Antarctica. But Fermi did not detect these energies.

“Fermi would have seen this sharp feature if it was really there, but it didn’t,” said Luca Latronico, a team member at the National Institute of Nuclear Physics (INFN) in Pisa, Italy. “With the LAT’s superior resolution and more than 100 times the number of electrons collected by balloon-borne experiments, we are seeing these cosmic rays with unprecedented accuracy.”

“Fermi’s next step is to look for changes in the cosmic-ray electron flux in different parts of the sky,” Latronico said. “If there is a nearby source, that search will help us unravel where to begin looking for it.”

Source: NASA

Do We Need a New Theory of Gravitation?

Draco satellite dwarf galaxy. Credit: Mischa Schirmer, University of Bonn

A group of physicists say that the distribution of satellite galaxies that orbit the Milky Way, as well as the apparent dark matter within them, presents a direct challenge to Newton’s theory of gravitation, as the galaxies are not where they should be. “There is something odd about their distribution,” said Professor Pavel Kroupa from the University of Bonn in Germany. “They should be uniformly arranged around the Milky Way, but this is not what we found.” Standard cosmological models predict the presence of hundreds of these companions around most of the larger galaxies, but up to now only 30 have been observed around the Milky Way. The physicists say that Newton’s theory of gravitation should be modified.

The astronomers from Germany, Austria and Australia looked at the small dwarf galaxies that orbit the Milky Way and discovered that the eleven brightest of the dwarf galaxies lie more or less in the same plane – in a kind of disk shape – and that they revolve in the same direction around the Milky Way (in the same way as planets in the Solar System revolve around the Sun). Some of these contain only a few thousand stars and so are relatively faint and difficult to find.

Professor Kroupa and the other physicists believe that this can only be explained if today’s satellite galaxies were created by ancient collisions between young galaxies. Team member Dr. Manuel Metz said, “Fragments from early collisions can form the revolving dwarf galaxies we see today, but this introduces a paradox. Calculations suggest that the dwarf satellites cannot contain any dark matter if they were created in this way. But this directly contradicts other evidence. Unless the dark matter is present, the stars in the galaxies are moving around much faster than predicted by Newton’s standard theory of gravitation.”

Metz added, “The only solution is to reject Newton’s theory. If we live in a Universe where a modified law of gravitation applies, then our observations would be explainable without dark matter.”

With this evidence, the team share the convictions of a number of groups around the world who believe that some of the fundamental principles of physics have been incorrectly understood. If their ideas are correct, it will not be the first time that Newton’s theory of gravitation has been modified. In the 20th century it happened when Einstein introduced his Special and General Theories of Relativity and again when quantum mechanics was developed to explain physics on sub-atomic scales. The anomalies detected by Dr. Metz and Professor Kroupa and their collaborators imply that where weak accelerations predominate, a ‘modified Newtonian dynamic’ may have to be used. If the scientists are right then this has far-reaching consequences for our understanding of the Universe we live in.
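As a toy illustration of that last point (my own sketch of generic modified Newtonian dynamics, not the team’s calculation): below an assumed characteristic acceleration scale a0, a common MOND-style interpolation boosts the effective acceleration, which flattens galactic rotation curves without invoking dark matter.

```python
# Toy MOND sketch: below a0 ~ 1.2e-10 m/s^2 the effective acceleration
# approaches sqrt(g_newton * a0) rather than g_newton, so outer-galaxy
# orbital speeds stay high even where Newtonian gravity is feeble.

import math

G = 6.674e-11   # m^3 kg^-1 s^-2
A0 = 1.2e-10    # MOND acceleration scale, m/s^2 (assumed value)

def newtonian_accel(mass_kg, r_m):
    return G * mass_kg / r_m**2

def mond_accel(g_newton):
    # "Simple" interpolating function: reduces to g_newton when
    # g_newton >> A0 and to sqrt(g_newton * A0) when g_newton << A0.
    return g_newton / 2 + math.sqrt(g_newton**2 / 4 + g_newton * A0)

# Outskirts of a Milky Way-like galaxy: assume ~1e41 kg of visible
# matter enclosed within 30 kpc.
r = 30 * 3.086e19                      # 30 kpc in metres
g_n = newtonian_accel(1e41, r)
v_newton = math.sqrt(g_n * r)          # Newtonian circular speed
v_mond = math.sqrt(mond_accel(g_n) * r)
print(f"Newtonian: {v_newton/1e3:.0f} km/s, MOND: {v_mond/1e3:.0f} km/s")
```

With these assumed numbers, the MOND speed comes out roughly twice the Newtonian one, which is the kind of discrepancy otherwise attributed to a dark matter halo.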

The two studies will appear in papers in Monthly Notices of the Royal Astronomical Society and the Astrophysical Journal.

Source: RAS

Hawking Update: Condition Improved

Professor Stephen Hawking in 2006. Credit: Wikipedia

Physicist/mathematician Stephen Hawking has improved after spending the night at a hospital near his home in Cambridge, England, and the 67-year-old’s condition was described as “comfortable.” Hawking’s first wife, Jane, was quoted as saying she believed his illness was no longer life-threatening. A spokesperson for Cambridge University, where Prof Hawking holds the post of Lucasian Professor of Mathematics, said that he would be kept in hospital for observation. “He is comfortable and his family is looking forward to him making a full recovery,” said Gregory Hayman. “He has had a good night but will be kept in at Addenbrooke’s Hospital for observation. He is showing signs of improvement.”

Hawking, best known as the author of the best-selling science book A Brief History of Time, was taken to hospital two days after returning from a tour of engagements in the United States. Cambridge University took the unprecedented step of commenting on Hawking’s condition, describing him as “very ill”.

Hawking suffers from ALS (amyotrophic lateral sclerosis), an incurable degenerative disorder also known as Lou Gehrig’s disease. He is wheelchair-bound and able to speak only with the help of a voice synthesiser.

He was diagnosed with the muscle-wasting disease at the age of 21, which has gradually robbed him of his voice and movement in his limbs.

At the time of being diagnosed with the disease, he was told that he could expect to live for two years but has become one of the oldest-known survivors of the disease, after more than 40 years.

Source: Telegraph

Physicist Hawking Gravely Ill

Stephen Hawking at NASA's StarChild Learning Center in the 1980s. Credit: NASA

Famed theoretical physicist Stephen Hawking has been rushed to a hospital and is seriously ill. Cambridge University released information today that Hawking has been fighting a chest infection for several weeks, and was taken to a hospital in Cambridge. “Professor Hawking is very ill,” said Gregory Hayman, the university’s head of communications. “He is undergoing tests. He has been unwell for a couple of weeks.” Hawking, 67, is well known for his work on black holes, and has remained active despite being diagnosed at 21 with ALS (amyotrophic lateral sclerosis), an incurable degenerative disorder also known as Lou Gehrig’s disease.

For several years, Hawking has been almost entirely paralyzed, and he communicates through an electronic voice synthesizer.

“Professor Hawking is a remarkable colleague. We all hope he will be amongst us again soon,” said Professor Peter Haynes, head of the university’s Department of Applied Mathematics and Theoretical Physics.

Hawking had canceled an appearance at Arizona State University on April 6 because of his illness.

He announced last year that he would step down from his post as Lucasian Professor of Mathematics, a title once held by the great 17th-century physicist Isaac Newton, at the end of this academic year. However, the university said Hawking intended to continue working as Emeritus Lucasian Professor of Mathematics.

Hawking has described himself as “lucky” despite his disease. Its slow progression has allowed him time to make influential discoveries, and it has not hindered him from having a very full life.

Source: Yahoo News

Small Engine For the Big Job of Testing Theory of Relativity

The FEEP. Credit: ESA

Researchers from the European Space Agency are testing what they describe as the smallest, yet most precisely controllable engine ever built for space. Measuring 10 centimeters (4 inches) across and making a faint blue glow as it runs, the Field Emission Electric Propulsion, or FEEP, engine produces an average thrust equivalent to the force of one falling hair. But its thrust range and controllability are far superior to more potent thrusters, and will be important for a future space mission that will test Einstein’s General Theory of Relativity.

“Most propulsion systems are employed to get a vehicle from A to B,” explained Davide Nicolini of the agency’s Scientific Projects Department, in charge of the engine research. “But with FEEP, the aim is to maintain a spacecraft in a fixed position, compensating for even the tiniest forces perturbing it, to an accuracy that no other engine design can match.”

Watching how objects behave when separated from all outside influences is a long-time ambition of physicists, but it can’t be done within Earth’s gravity field. So a next-decade mission called LISA Pathfinder (a precursor to the Laser Interferometer Space Antenna) will fly 1.5 million km (900,000 miles) to one of the Lagrangian points, L1. There, the gravitational pulls of the Sun and the Earth balance out, so that the behavior of a pair of free-floating test objects can be precisely monitored.

But to detach the experiment fully from the rest of the Universe there will still be some remaining perturbations to overcome, most notably the slight but continuous pressure of sunlight itself. That’s where FEEP comes in. It operates on the same basic principle as other ion engines flown aboard ESA’s SMART-1 Moon mission and other spacecraft: the application of an electric field serves to accelerate electrically-charged atoms (known as ions), producing thrust.

But while the thrust of other ion engines is measured in millinewtons, FEEP’s performance is assessed in micronewtons – a unit one thousand times smaller. The engine has a thrust range of 0.1–150 micronewtons, with a resolution better than 0.1 micronewtons and a time response of 190 milliseconds (about a fifth of a second) or better.
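For scale, a back-of-the-envelope estimate (with an assumed illuminated area, not a mission figure) shows that the solar radiation pressure such an engine must counteract is itself a few micronewtons, comfortably inside FEEP’s range:

```python
# Solar radiation pressure force on a spacecraft, F = (S / c) * A,
# for a perfectly absorbing surface: S is the solar constant near
# Earth and A the illuminated cross-section.

S = 1361.0     # solar constant near Earth, W/m^2
c = 2.998e8    # speed of light, m/s
area = 2.0     # assumed illuminated area, m^2 (illustrative)

force_newtons = (S / c) * area
print(f"{force_newtons * 1e6:.1f} micronewtons")
```

A few micronewtons of continuous photon pressure is tiny, but left uncompensated it would swamp a free-fall experiment, hence a thruster whose entire dynamic range lives at that scale.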

The engine uses liquid metal caesium as propellant. Through capillary action—a phenomenon associated with surface tension—caesium flows between a pair of metal surfaces that end in a razor-sharp slit. The caesium stays at the mouth of the slit until an electric field is generated. This causes tiny cones to form in the liquid metal which have charged atoms shooting from their tips to create thrust.

Twelve thrusters will be used for LISA Pathfinder. Working together with another propulsion system designed by NASA, the thrusters should yield directional control at least 100 times more accurate than any spacecraft before, down to a millionth of a millimeter.

LISA involves three satellites up to five million km (three million miles) apart and linked by lasers, orbiting the Sun. The aim is to detect ripples in space and time known as gravitational waves, predicted by Einstein’s theory of general relativity but so far undetected. The waves would cause tiny variations in the distance measured between the satellites.

The engine was tested last month, and once the tests are analyzed and the concept is proven, the FEEP technology has been earmarked for a broad range of other missions, including precision formation flying for astronomy, Earth observation and drag-free satellites for mapping variations in Earth’s gravity.

Source: ESA

Warp Drives Probably Impossible After All

No warp speed ahead


Just when I was getting excited about the possibility of travelling to distant worlds, scientists have uncovered a deep flaw with faster-than-light-speed travel. There appears to be a quantum limit on how fast an object can travel through space-time, regardless of whether we are able to create a bubble in space-time or not…

First off, we have no clue about how to generate enough energy to create a “bubble” in space-time. This idea was first put on a scientific footing by Miguel Alcubierre from the University of Mexico in 1994; before that, it was popularized only by science fiction universes such as Star Trek. However, to create this bubble we need some form of exotic matter to fuel a hypothetical energy generator outputting 10^45 joules (according to calculations by Richard K. Obousy and Gerald Cleaver in the paper “Putting the Warp into Warp Drive”). Physicists are not afraid of big numbers, and we are not afraid of words like “hypothetical” and “exotic”, but to put this energy in perspective, we would need to turn all of Jupiter’s mass into energy to even hope to distort space-time around an object.
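A quick sanity check on that comparison via E = mc²: Jupiter’s rest-mass energy is indeed within an order of magnitude of the quoted figure.

```python
# Order-of-magnitude check: converting all of Jupiter's mass to
# energy via E = m * c**2 gives roughly 10^44 J, the same ballpark
# as the ~10^45 J quoted for the warp bubble.

c = 2.998e8              # speed of light, m/s
m_jupiter = 1.898e27     # Jupiter's mass, kg

energy_joules = m_jupiter * c**2
print(f"{energy_joules:.1e} J")  # ~1.7e44 J
```

For comparison, the Sun’s total luminosity over an entire year is about 10^34 J, some ten orders of magnitude short, so "a lot of energy" is an understatement.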

This is a lot of energy.

If a sufficiently advanced human race could generate this much energy, I would argue that we would be masters of our Universe anyway; who would need warp drive when we could just as well create wormholes, star gates or access parallel universes? Yes, warp drive is science fiction, but it’s interesting to investigate this possibility and open up physical scenarios where warp drive might work. Let’s face it, anything less than light-speed travel is a real downer for our potential to travel to other star systems, so we need to keep our options open, no matter how futuristic.

The space-time bubble. Unfortunately, quantum physics may have the final word (Richard K. Obousy & Gerald Cleaver, 2008)
Although warp speed is highly theoretical, at least it is based on some real physics. It’s a mix of superstring and multi-dimensional theory, but warp speed seems to be possible, assuming a vast supply of energy. If we can “simply” squash the tightly curled extra dimensions (beyond the “normal” four we live in) in front of a futuristic spacecraft and expand them behind, a bubble of stationary space will be created for the spacecraft to reside in. This way, the spaceship isn’t travelling faster than light inside the bubble; the bubble itself is zipping through the fabric of space-time, facilitating faster-than-light travel. Easy.

Not so fast.

According to new research on the subject, quantum physics has something to say about our dreams of zipping through space-time faster than c. What’s more, Hawking radiation would most likely cook anything inside this theoretical space-time bubble anyway. The Universe does not want us to travel faster than the speed of light.

“On one side, an observer located at the center of a superluminal warp-drive bubble would generically experience a thermal flux of Hawking particles,” say Stefano Finazzi and co-authors from the International School for Advanced Studies in Trieste, Italy. “On the other side, such Hawking flux will be generically extremely high if the exotic matter supporting the warp drive has its origin in a quantum field satisfying some form of Quantum Inequalities.”

In short, Hawking radiation (usually associated with the radiation of energy, and therefore loss of mass, of evaporating black holes) will be generated, irradiating the occupants of the bubble to unimaginably high temperatures. The Hawking radiation will be generated as horizons form at the front and rear of the bubble. Remember those big numbers physicists aren’t afraid of? Hawking radiation is predicted to roast anything inside the bubble to a possible 10^30 K (the maximum possible temperature, the Planck temperature, is 10^32 K).
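The standard Hawking temperature formula (a textbook result, not specific to this paper) shows why small horizons are so hot: the temperature scales inversely with the mass behind the horizon.

```python
# Hawking temperature of a horizon of mass M:
# T = hbar * c**3 / (8 * pi * G * M * k_B).
# Smaller mass -> hotter radiation, which is why tiny horizons on a
# warp bubble would be so lethal.

import math

G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
hbar = 1.055e-34    # J*s
k_B = 1.381e-23     # J/K

def hawking_temperature_k(mass_kg):
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

# A solar-mass black hole is far colder than the cosmic microwave
# background; a 1-kg horizon is already absurdly hot.
print(f"{hawking_temperature_k(1.989e30):.1e}")  # ~6e-8 K
print(f"{hawking_temperature_k(1.0):.1e}")       # ~1.2e23 K
```

Extrapolating down toward Planck-scale masses pushes the temperature toward the 10^30 K figure quoted above.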

Even if we could overcome this obstacle, Hawking radiation appears to be symptomatic of an even bigger problem; the space-time bubble would be unstable, on a quantum level.

“Most of all, we find that the RSET [renormalized stress-energy tensor] will exponentially grow in time close to, and on, the front wall of the superluminal bubble. Consequently, one is led to conclude that the warp-drive geometries are unstable against semiclassical back-reaction,” Finazzi adds.

However, if you wanted to create a space-time bubble for subluminal (less-than light speed) travel, no horizons form, and therefore no Hawking radiation is generated. In this case, you might not be beating the speed of light, but you do have a fast, and stable way of getting around the Universe. Unfortunately we still need “exotic” matter to create the space-time bubble in the first place…

Sources: “Semiclassical instability of dynamical warp drives,” Stefano Finazzi, Stefano Liberati, Carlos Barceló, 2009, arXiv:0904.0141v1 [gr-qc], “Investigation into Compactified Dimensions: Casimir Energies and Phenomenological Aspects,” Richard K. Obousy, 2009, arXiv:0901.3640v1 [gr-qc]

Via: The Physics arXiv Blog

Astrophysics Satellite Detects Dark Matter Clue?


An international collaboration of astronomers is reporting an unusual spike of atmospheric particles that could be a long-sought signature of dark matter.

The orbiting PAMELA satellite, an astrophysics mission operated by Italy, Russia, Germany and Sweden, has detected a glut of positrons — antimatter counterparts to electrons — in the energy range theorized to be associated with the decay of dark matter. The results appear in this week’s issue of the journal Nature.

Dark matter is the unseen substance that accounts for most of the mass of our universe, and whose presence can be inferred from its gravitational effects on visible matter. When dark matter particles collide and annihilate one another, they should yield a variety of subatomic particles, including electrons and positrons.

Antiparticles account for a small fraction of cosmic rays and are also known to be produced in interactions between cosmic-ray nuclei and atoms in the interstellar medium, which is referred to as a “secondary source.”

Previous statistically limited measurements of the ratio of positron and electron fluxes have been interpreted as evidence for a primary source for the positrons, as has an increase in the total electron-positron flux at energies between 300 and 600 GeV. Primary sources could include pulsars, microquasars or dark matter annihilation. 

Lead study author Oscar Adriani, an astrophysics researcher at the University of Florence in Italy, and his colleagues are reporting a positron to electron ratio that systematically increases in a way that could indicate dark matter annihilation.

The new paper reports a measurement of the positron fraction in the energy range 1.5–100 GeV.

“We find that the positron fraction increases sharply over much of that range, in a way that appears to be completely inconsistent with secondary sources,” the authors wrote in the Nature paper. “We therefore conclude that a primary source, be it an astrophysical object or dark matter annihilation, is necessary.” Another feasible source for the antimatter particles, besides dark matter annihilation, could be a pulsar, they note.

PAMELA, which stands for Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics, was launched in June 2006 and initially slated to last three years. Mission scientists now say it will continue to collect data until at least December 2009, which will help pin down whether the positrons are coming from dark matter annihilation or a single, nearby source.

Source: Nature (there is also an arXiv/astro-ph version here)