Hey. We’re all aware of Einstein’s theories and how gravity affects light. We know the bending of starlight was confirmed during the 1919 total solar eclipse, but light can be bent by other gravitational influences, too. If it can happen from something as small as a star, then what might occur with a huge group of stars? Like a galaxy… Or a group of galaxies!
What’s new in the world of light? Astrophysicists at the Dark Cosmology Centre at the Niels Bohr Institute have now gone around the bend and come up with a method of measuring how outgoing light is affected by the gravity of galaxy clusters. Not only does each individual star and each individual galaxy possess its own gravity, but a galaxy group is held together by gravitational attraction as well. Sure, it stands to reason that gravity is affecting what we see – but there’s even more to it. Redshift…
“It is really wonderful. We live in an era with the technological ability to actually measure such phenomena as cosmological gravitational redshift”, says astrophysicist Radek Wojtak of the Dark Cosmology Centre at the Niels Bohr Institute, University of Copenhagen.
Together with team members Steen Hansen and Jens Hjorth, Wojtak has been collecting light data and measurements from 8,000 galaxy clusters. Their studies have covered galaxies near the cluster centres as well as those residing at the periphery.
“We could measure small differences in the redshift of the galaxies and see that the light from galaxies in the middle of a cluster had to ‘crawl’ out through the gravitational field, while it was easier for the light from the outlying galaxies to emerge”, explains Radek Wojtak.
The next step in the equation is to measure the entire galaxy cluster’s total mass to arrive at its gravitational potential. Then, using the general theory of relativity, the gravitational redshift could be determined as a function of each galaxy’s location within the cluster.
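To get a feel for the size of the effect, here is a rough back-of-the-envelope estimate, using the weak-field approximation z ≈ GM/(Rc²). The cluster mass and radius below are illustrative round numbers, not the team’s measured values:

```python
# Rough estimate of gravitational redshift for light escaping a galaxy cluster.
# Weak-field approximation: z_grav ~ G*M / (R * c^2).
# The mass and radius are illustrative round numbers, not measured values.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec in metres

M_cluster = 1e14 * M_SUN   # assumed cluster mass
R_cluster = 1.0 * MPC      # assumed cluster radius

z_grav = G * M_cluster / (R_cluster * c**2)
print(f"gravitational redshift z ~ {z_grav:.1e}")
print(f"equivalent velocity shift ~ {z_grav * c / 1e3:.1f} km/s")
```

The result is a shift of only a few parts per million – equivalent to a velocity offset of order a kilometre per second – which is why averaging over thousands of clusters was needed to tease the signal out.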
“It turned out that the theoretical calculations of the gravitational redshift based on the general theory of relativity were in complete agreement with the astronomical observations,” explains Wojtak. “Our analysis of observations of galaxy clusters shows that the redshift of the light is proportionally offset in relation to the gravitational influence from the galaxy cluster’s gravity. In that way our observations confirm the theory of relativity.”
Of course, this kind of revelation also has other implications… theoretical dark matter just might play a role in gravitational redshift, too. And don’t forget dark energy. All these hypothetical models need to be taken into account. But, for now, we’re looking at the big picture in a different way.
“Now the general theory of relativity has been tested on a cosmological scale and this confirms that the general theory of relativity works and that means that there is a strong indication for the presence of dark energy”, explains Radek Wojtak.
As Walt Whitman once said, “I open the scuttle at night and see the far-sprinkled systems, And all I see multiplied as high as I can cypher edge but the rim of the farther systems. Wider and wider they spread, expanding, always expanding, Outward and outward and forever outward.”
Nope. A standard candle isn’t the same red, green, blue, yellow and omnipresent pink wax sticks that decorate your everyday birthday cake. Until now a standard candle meant a Cepheid variable star – or more recently – a Type Ia supernova. But something new happens almost every day in astronomy, doesn’t it? So start thinking about how an active galactic nucleus could be used to determine distance…
“Accurate distances to celestial objects are key to establishing the age and energy density of the Universe and the nature of dark energy,” says Darach Watson (et al). “A distance measure using active galactic nuclei (AGN) has been sought for more than forty years, as they are extremely luminous and can be observed at very large distances.”
So how is it done? As we know, active galactic nuclei are home to supermassive black holes which unleash powerful radiation. When this radiation ionizes nearby gas clouds, those clouds emit their own light signature. With both emissions within reach of data-gathering telescopes, all that’s needed is a way to measure the delay between a change in the central source and the answering change in the clouds. The process is called reverberation mapping.
“We use the tight relationship between the luminosity of an AGN and the radius of its broad line region established via reverberation mapping to determine the luminosity distances to a sample of 38 AGN.” says Watson. “All reliable distance measures up to now have been limited to moderate redshift — AGN will, for the first time, allow distances to be estimated to z~4, where variations of dark energy and alternate gravity theories can be probed.”
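The chain of reasoning can be sketched in a few lines. Every number here – the lag, the flux, and the radius-luminosity normalisation – is an invented placeholder purely to show the logic, not a value from the Watson et al. sample:

```python
import math

# Sketch of the AGN "standard candle" logic via reverberation mapping.
# All numbers are invented placeholders to illustrate the chain of reasoning.

c = 2.998e8                      # speed of light, m/s
DAY = 86400.0                    # seconds in a day

lag = 100 * DAY                  # assumed lag between continuum and BLR response
R_blr = c * lag                  # broad-line region radius, metres

# Assumed radius-luminosity calibration: L = L0 * (R / R0)**2
R0 = c * 10 * DAY                # hypothetical anchor radius
L0 = 1e37                        # hypothetical anchor luminosity, watts
L = L0 * (R_blr / R0) ** 2       # inferred intrinsic luminosity

F = 1e-15                        # assumed observed flux, W/m^2
d_L = math.sqrt(L / (4 * math.pi * F))   # luminosity distance, metres

MPC = 3.086e22
print(f"inferred luminosity distance ~ {d_L / MPC:.0f} Mpc")
```

In words: the measured time lag gives the broad-line region’s radius, the radius-luminosity relation converts that into an intrinsic luminosity, and comparing luminosity with observed flux yields the distance.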
The team hasn’t taken their research “lightly”. It means careful calculations using known factors and repeating the results with other variables thrown into the mix. Even uncertainty…
“The scatter due to observational uncertainty can be reduced significantly. A major advantage held by AGN is that they can be observed repeatedly and the distance to any given object substantially refined.” explains Watson. “The ultimate limit of the accuracy of the method will rely on how the BLR (broad-line emitting region) responds to changes in the luminosity of the central source. The current tight radius-luminosity relationship indicates that the ionisation parameter and the gas density are both close to constant across our sample.”
At the first standard candle we discovered the Universe was expanding. At the second we learned it was accelerating. Now we’re looking back to just 750 million years after the Big Bang. What will tomorrow bring?
A few days ago, the physics world was turned upside down at the announcement of “faster than the speed of light”. The mighty neutrino has struck again by breaking the cosmic speed limit and traveling at a velocity 20 parts per million above light speed. To absolutely verify this occurrence, collaboration is needed from different sources and we’re here to give you the latest update.
“This result comes as a complete surprise,” said OPERA spokesperson, Antonio Ereditato of the University of Bern. “After many months of studies and cross checks we have not found any instrumental effect that could explain the result of the measurement. While OPERA researchers will continue their studies, we are also looking forward to independent measurements to fully assess the nature of this observation.”
Since the OPERA measurements go against everything we think we know, it’s more important than ever to verify its findings through independent research.
“When an experiment finds an apparently unbelievable result and can find no artifact of the measurement to account for it, it’s normal procedure to invite broader scrutiny, and this is exactly what the OPERA collaboration is doing, it’s good scientific practice,” said CERN Research Director Sergio Bertolucci. “If this measurement is confirmed, it might change our view of physics, but we need to be sure that there are no other, more mundane, explanations. That will require independent measurements.”
To get the job done, the OPERA Collaboration joined forces with CERN metrology experts and other facilities to establish absolute calibrations. Two parameters had to be pinned down with essentially no room for error: the distance between source and detector, and the neutrino’s flight time. In this case, the distance between the neutrino beam’s origin and OPERA was measured with an uncertainty of just 20 cm over the 730 km baseline, and the flight time was determined to within 10 nanoseconds using high-precision GPS equipment and an atomic clock. Every care was taken to ensure precision.
“We have established synchronization between CERN and Gran Sasso that gives us nanosecond accuracy, and we’ve measured the distance between the two sites to 20 centimetres,” said Dario Autiero, the CNRS researcher who will give this afternoon’s seminar. “Although our measurements have low systematic uncertainty and high statistical accuracy, and we place great confidence in our results, we’re looking forward to comparing them with those from other experiments.”
“The potential impact on science is too large to draw immediate conclusions or attempt physics interpretations. My first reaction is that the neutrino is still surprising us with its mysteries.” said Ereditato. “Today’s seminar is intended to invite scrutiny from the broader particle physics community.”
It’s been a tenet of modern physics for over a century. The speed of light is an unwavering and unbreakable barrier, at least to any form of matter or energy we know of. Nothing in our Universe can travel faster than 299,792 km/s (186,282 miles per second), not even – as the term implies – light itself. It’s the universal constant, the “c” in Einstein’s E = mc², a cosmic speed limit that can’t be broken.
That is, until now.
An international team of scientists at the Gran Sasso research facility northeast of Rome announced today that they have clocked neutrinos traveling faster than the speed of light. The neutrinos, subatomic particles with very little mass, were contained within beams emitted from CERN, 730 km (454 miles) away near Geneva. Over a period of three years, 15,000 bunches of neutrinos were fired from CERN at special detectors located deep underground at Gran Sasso. Where light would have made the trip in about 2.4 thousandths of a second, the neutrinos arrived 60 nanoseconds sooner – that’s 60 billionths of a second – a tiny difference to us but a huge difference to particle physicists!
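The headline arithmetic is simple enough to check yourself. Note that the naive ratio of 60 ns to the 2.4 ms light travel time comes out closer to 25 parts per million than the widely quoted 20; OPERA’s published figure rests on a more careful statistical analysis:

```python
# Check the OPERA headline arithmetic: 730 km baseline, neutrinos 60 ns early.

c = 2.9979e8                  # speed of light, m/s
baseline = 730.0e3            # CERN to Gran Sasso distance, m
early = 60e-9                 # reported early arrival, s

t_light = baseline / c        # light travel time over the baseline
t_neutrino = t_light - early  # implied neutrino flight time

# fractional speed excess (v - c)/c for the same distance in less time
excess = (t_light - t_neutrino) / t_neutrino
print(f"light travel time: {t_light*1e3:.3f} ms")
print(f"fractional speed excess: {excess*1e6:.1f} parts per million")
```

This also shows why the 20 cm distance uncertainty and 10 ns timing accuracy mattered so much: the entire claimed effect is only 60 ns out of 2.4 million ns.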
The implications of such a discovery are staggering, as it would effectively undermine Einstein’s theory of relativity and force a rewrite of the Standard Model of physics.
“We are shocked,” said project spokesman and University of Bern physicist Antonio Ereditato.
“We have high confidence in our results. We have checked and rechecked for anything that could have distorted our measurements but we found nothing. We now want colleagues to check them independently.”
Neutrinos are created naturally from the decay of radioactive materials and from reactions that occur inside stars. Neutrinos are constantly zipping through space and can pass through solid material easily with little discernible effect… as you’ve been reading this billions of neutrinos have already passed through you!
The experiment, called OPERA (Oscillation Project with Emulsion-tRacking Apparatus) is located in Italy’s Gran Sasso facility 1,400 meters (4,593 feet) underground and uses a complex array of electronics and photographic plates to detect the particle beams. Its subterranean location helps prevent experiment contamination from other sources of radiation, such as cosmic rays. Over 750 scientists from 22 countries around the world work there.
Ereditato is confident in the results as they have been consistently measured in over 16,000 events over the past two years. Still, other experiments are being planned elsewhere in an attempt to confirm these remarkable findings. If they are confirmed, we may be looking at a literal breakdown of the modern rules of physics as we know them!
“We have high confidence in our results,” said Ereditato. “We have checked and rechecked for anything that could have distorted our measurements but we found nothing. We now want colleagues to check them independently.”
A preprint of the OPERA results will be posted on the physics website arXiv.org.
Well, we’re off to see the Wizard again, my friends. This time it’s to explore the possibilities of primordial black holes colliding with stars and all the implications therein. If this theory is correct, then we should be able to observe the effects of dark matter first hand – proof that it really does exist – and deepen our understanding of the very core of the Universe.
Are primordial black holes candidates for dark matter? Postdoctoral researchers Shravan Hanasoge of Princeton’s Department of Geosciences and Michael Kesden of NYU’s Center for Cosmology and Particle Physics have utilized computer modeling to visualize a primordial black hole passing through a star. “Stars are transparent to the passage of primordial black holes (PBHs) and serve as seismic detectors for such objects,” says Kesden. “The gravitational field of a PBH squeezes a star and causes it to ring acoustically.”
If primordial black holes do exist, then chances are good that these types of collisions happen within our own galaxy – and frequently. With ever more telescopes and satellites observing the stellar neighborhoods, it only stands to reason that sooner or later we’re going to catch one of these events. But the most important thing is simply understanding what we’re looking for. The computer model developed by Hanasoge and Kesden can be used with current solar-observation techniques to offer a more precise method for detecting primordial black holes than existing tools.
“If astronomers were just looking at the Sun, the chances of observing a primordial black hole are not likely, but people are now looking at thousands of stars,” Hanasoge said. “There’s a larger question of what constitutes dark matter, and if a primordial black hole were found it would fit all the parameters — they have mass and force so they directly influence other objects in the Universe, and they don’t interact with light. Identifying one would have profound implications for our understanding of the early Universe and dark matter.”
Sure, we haven’t seen dark matter directly, but what we can see are galaxies hypothesized to sit inside extended dark-matter halos, and we can study the effects that gravity has on their visible material – gaseous regions and stellar members alike. If these new models are correct, primordial black holes should be heavier than other proposed dark matter particles, and when they collide with a star they should set off a rippling effect.
“If you imagine poking a water balloon and watching the water ripple inside, that’s similar to how a star’s surface appears,” Kesden said. “By looking at how a star’s surface moves, you can figure out what’s going on inside. If a black hole goes through, you can see the surface vibrate.”
Using the Sun as a model, Kesden and Hanasoge calculated the effects a PBH might have and then gave the data to NASA’s Tim Sandstrom. Using the Pleiades supercomputer at the agency’s Ames Research Center in California, the team was then able to create a video simulation of the collisional effect. Below is the clip which shows the vibrations of the Sun’s surface as a primordial black hole — represented by a white trail — passes through its interior.
“It’s been known that as a primordial black hole went by a star, it would have an effect, but this is the first time we have calculations that are numerically precise,” comments Marc Kamionkowski, a professor of physics and astronomy at Johns Hopkins University. “This is a clever idea that takes advantage of observations and measurements already made by solar physics. It’s like someone calling you to say there might be a million dollars under your front doormat. If it turns out to not be true, it cost you nothing to look. In this case, there might be dark matter in the data sets astronomers already have, so why not look?”
Radioactive decay – a random process, right? Well, according to some – maybe not. For several years now a team of physicists from Purdue and Stanford have reviewed isotope decay data across a range of different isotopes and detectors – seeing a non-random pattern and searching for a reason. And now, after eliminating all other causes, the team is ready to declare that the cause is… extraterrestrial.
OK, so it’s suggested to just be the Sun – but cool finding, huh? Well… maybe it’s best to first put on your skeptical goggles before reading through anyone’s claim of discovering new physics.
Now, it’s claimed that there is a certain periodicity to the allegedly variable radioactive decay rates. An annual periodicity suggests a link to the varying Earth–Sun distance over the Earth’s elliptical orbit, while other overlying patterns of periodicity may link to the production of large solar flares and the 11-year (or 22-year, if you prefer) solar cycle.
However, the alleged variations in decay rates are proportionally tiny and there remain a good deal of critics citing disconfirming evidence to this somewhat radical idea. So before drawing any conclusions here, maybe we need to first consider what exactly good science is:
• Replication – a different laboratory or observatory can collect the same data that you claim to have collected.
• A signal stronger than noise – there is a discrete trend within your data that is statistically distinguishable from its random noise.
• A plausible mechanism – for example, if the rate of radioactive decay seems to correlate with the position and magnetic activity of the Sun – why is this so?
• A testable hypothesis – the plausible mechanism proposed should allow you to predict when, or under what circumstances, the effect can be expected to occur again.
The proponents of variable radioactive decay appeal to a range of data sources to meet the replication criterion, but independent groups equally appeal to other data sources which are not consistent with variable radioactive decay. So, there’s still a question mark here – at least until more confirming data comes in, to overwhelm any persisting disconfirming data.
Whether there is a signal stronger than noise is probably the key point of debate. The alleged periodic variations in radioactive decay are proportionally tiny variations and it’s not clear whether a compellingly clear signal has been demonstrated.
An accompanying paper outlines the team’s proposed mechanism – although this is not immediately compelling either. They appeal to neutrinos, which are certainly produced in abundance by the Sun, but actually propose a hypothetical form that they call ‘neutrellos’, which necessarily interact with atomic nuclei more strongly than neutrinos are considered to do. This creates a bit of a circular argument – because we think there is an effect currently unknown to science, we propose that it is caused by a particle currently unknown to science.
So, in the context of having allegedly found a periodic variability in radioactive decay, what the proponents need to do is to make a prediction – that sometime next year, say at a particular latitude in the northern hemisphere, the radioactive decay of x isotope will measurably alter by z amount compared to an equivalent measure made, say six months earlier. And maybe they could collect some neutrellos too.
If that all works out, they could start checking the flight times to Sweden. But one assumes that it won’t be quite that easy.
Surprisingly, rumors still persist in some corners of the Internet that the Large Hadron Collider (LHC) is going to destroy the Earth – even though nearly three years have passed since it was first turned on. This may be because it is not due to be ramped up to full power until 2014 – although it seems more likely that this is just a case of moving the goal posts, since the same doomsayers were initially adamant that the Earth would be destroyed the moment the LHC was switched on, in September 2008.
The story goes that the very high energy collisions engineered by the LHC could jam colliding particles together with such force that their mass would be compressed into a volume less than the Schwarzschild radius required for that mass. In other words, a microscopic black hole would form and then grow in size as it sucked in more matter, until it eventually consumed the Earth.
Here’s a brief run-through of why this can’t happen.
1. Microscopic black holes are implausible.
While a teaspoon of neutron star material might weigh several million tons, if you extract a teaspoon of neutron star material from a neutron star it will immediately blow out into the volume you might expect several million tons of mass to usually occupy.
Notwithstanding you can’t physically extract a teaspoon of black hole material from a black hole – if you could, it is reasonable to expect that it would also instantly expand. You can’t maintain these extreme matter densities outside of a region of extreme gravitational compression that is created by the proper mass of a stellar-scale object.
The hypothetical physics that might allow for the creation of microscopic black holes (large extra dimensions) proposes that gravity gains more force in near-Planck scale dimensions. There is no hard evidence to support this theory – indeed there is a growing level of disconfirming evidence arising from various sources, including the LHC.
High energy particle collisions involve converting momentum energy into heat energy, as well as overcoming the electromagnetic repulsion that normally prevents charged particles from colliding. But the heat energy produced quickly dissipates and the collided particles fragment into sub-atomic shrapnel, rather than fusing together. Particle colliders attempt to mimic conditions similar to the Big Bang, not the insides of massive stars.
2. A hypothetical microscopic black hole couldn’t devour the Earth anyway.
Although whatever goes on inside the event horizon of a black hole is a bit mysterious and unknowable – physics still operates in a conventional fashion outside. The gravitational influence exerted by the mass of a black hole falls away by the inverse square of the distance from it, just like it does for any other celestial body.
The gravitational influence exerted by a microscopic black hole composed of, let’s say, 1,000 hyper-compressed protons would be laughably small beyond its Schwarzschild radius (maybe 10⁻¹⁸ metres). And it would be unable to consume more matter unless it could overcome the forces that hold other matter together – remembering that, at this scale, gravity is by far the weakest of the fundamental forces.
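For perspective, in ordinary four-dimensional gravity the Schwarzschild radius for 1,000 protons’ worth of mass is almost unimaginably smaller even than the ~10⁻¹⁸ m figure above (which comes from the large-extra-dimension scenarios). A quick calculation:

```python
# Schwarzschild radius r_s = 2*G*M / c^2 for 1000 protons' worth of mass,
# in standard (4D) gravity. Large-extra-dimension models would give a much
# larger radius; this is the conventional value.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
m_proton = 1.673e-27   # proton mass, kg

M = 1000 * m_proton
r_s = 2 * G * M / c**2
print(f"Schwarzschild radius ~ {r_s:.1e} m")
```

The answer is of order 10⁻⁵¹ metres – vastly smaller than the Planck length (~1.6 × 10⁻³⁵ m), which underlines how much heavy lifting the hypothetical extra-dimension physics has to do for microscopic black holes to be plausible at all.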
It’s been calculated that if the Earth had the density of solid iron, a hypothetical microscopic black hole in linear motion would be unlikely to encounter an atomic nucleus more than once every 200 kilometres – and if it did, it would encounter a nucleus that would be at least 1,000 times larger in diameter.
So the black hole couldn’t hope to swallow the whole nucleus in one go and, at best, it might chomp a bit off the nucleus in passing – somehow overcoming the strong nuclear force in so doing. The microscopic black hole might have 100 such encounters before its momentum carried it all the way through the Earth and out the other side, at which point it would probably still be a good order of magnitude smaller in size than an uncompressed proton.
And that still leaves the key issue of charge out of the picture. If you could jam multiple positively-charged protons together into such a tiny volume, the resultant object should explode, since the electromagnetic force far outweighs the gravitational force at this scale. You might get around this if an exactly equivalent number of electrons were also added in, but this requires appealing to an implausible level of fine-tuning.
3. What the doomsayers say
When challenged with the standard argument that higher-than-LHC energy collisions occur naturally and frequently as cosmic ray particles collide with Earth’s upper atmosphere, LHC conspiracy theorists refer to the high school physics lesson that two cars colliding head-on is a more energetic event than one car colliding with a brick wall. This is true, to the extent that the two-car collision has twice the kinetic energy of the one-car collision. However, cosmic ray collisions with the atmosphere have been measured as having 50 times the energy that will ever be generated by LHC collisions.
In response to the argument that a microscopic black hole would pass through the Earth before it could achieve any appreciable mass gain, LHC conspiracy theorists propose that an LHC collision would bring the combined particles to a dead stop and they would then fall passively towards the centre of the Earth with insufficient momentum to carry them out the other side.
This is also implausible. The slightest degree of transverse momentum imparted to LHC collision fragments after a head-on collision of two particles travelling at nearly 300,000 kilometres a second will easily give those fragments an escape velocity from the Earth (which is only 11.2 kilometres a second, at sea-level).
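The sea-level escape velocity quoted above is easy to verify from Earth’s mass and mean radius:

```python
import math

# Verify Earth's sea-level escape velocity: v_esc = sqrt(2*G*M / R).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24   # Earth mass, kg
R_earth = 6.371e6    # mean Earth radius, m

v_esc = math.sqrt(2 * G * M_earth / R_earth)
print(f"escape velocity ~ {v_esc/1e3:.1f} km/s")
```

At roughly 11.2 km/s, that threshold is more than four orders of magnitude below the near-light speeds of LHC collision fragments, which is the point being made.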
The nature of the highly compressed matter that makes up neutron stars has been the subject of much speculation. For example, it’s been suggested that under extreme gravitational compression the neutrons may collapse into quark matter containing strange quarks – which suggests that you should start calling a particularly massive neutron star a strange star.
However, an alternate model suggests that within massive neutron stars – rather than the neutrons collapsing into more fundamental particles, they might just be packed more tightly together by adopting a cubic shape. This might allow such cubic neutrons to be packed into about 75% of the volume that spherical neutrons would normally occupy.
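The ~75% figure follows from elementary packing geometry: cubes tile space completely, while identical spheres at best fill π/√18 of it (the Kepler close-packing density). A quick check:

```python
import math

# Densest packing of identical spheres (face-centred cubic / Kepler conjecture)
# versus cubes, which tile space completely. If the same matter is repacked as
# cubes, it fits into roughly the sphere-packing fraction of the old volume.

sphere_packing = math.pi / math.sqrt(18)   # Kepler close-packing density
cube_packing = 1.0

volume_ratio = sphere_packing / cube_packing
print(f"sphere close-packing fraction: {sphere_packing:.4f}")
print(f"cubic neutrons need ~{volume_ratio:.0%} of the spherical volume")
```

The sphere-packing fraction works out to about 0.74, which is where the "about 75% of the volume" in the cubic-neutron proposal comes from.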
Some rethinking about the internal structure of neutron stars has been driven by the 2010 discovery that the neutron star PSR J1614–2230, has a mass of nearly two solar masses – which is a lot for a neutron star that probably has a diameter of less than 20 kilometres.
PSR J1614–2230, described by some as a ‘superheavy’ neutron star, might seem an ideal candidate for the formation of quark matter – or some other exotic transformation – resulting from the extreme compression of neutron star material. However, calculations suggest that such a significant rearrangement of matter would shrink the star’s volume down to less than the Schwarzschild radius for two solar masses – meaning that PSR J1614–2230 should immediately form a black hole.
But nope, PSR J1614–2230 is there for all to observe: a superheavy neutron star, almost certainly composed of nothing more exotic than neutrons throughout, apart from a surface layer of more conventional atomic matter.
Nonetheless, stellar-sized black holes can and do form from neutron stars. For example, if a neutron star in a binary system keeps drawing mass off its companion star, it will eventually reach the Tolman–Oppenheimer–Volkoff limit. This is the ultimate mass limit for neutron stars – similar in concept to the Chandrasekhar limit for white dwarf stars. Once a white dwarf reaches the Chandrasekhar limit of 1.4 solar masses it detonates as a Type Ia supernova. Once a neutron star reaches the Tolman–Oppenheimer–Volkoff mass limit, it becomes a black hole.
Due to our current limited understanding of neutron star physics, no one is quite sure what the Tolman–Oppenheimer–Volkoff mass limit is, but it is thought to lie somewhere between 1.5 and 3.0 solar masses.
So, PSR J1614–2230 seems likely to be close to this neutron star mass limit, even though it is still composed of neutrons. But there must be some method whereby a neutron star’s mass can be compressed into a smaller volume, otherwise it could never form a black hole. So, there should be some intermediary state whereby a neutron star’s neutrons become progressively compressed into a smaller volume until the Schwarzschild radius for its mass is reached.
Llanes-Estrada and Navarro propose that this problem could be solved if, under extreme gravitational pressure, the neutrons’ geometry became deformed into smaller cubic shapes to allow tighter packing, although the particles still remain as neutrons.
So if it turns out that the universe does not contain strange stars after all, having cubic neutron stars instead would still be agreeably unusual.
Further reading: Llanes-Estrada and Navarro. Cubic neutrons.
Universe Today: You’ve been really busy, with writing books, filming two television series and DVDs. Do you have time to do research in particle physics as well?
Brian Cox: Well, I must say I’ve been a bit restricted over the past couple of years in how much research I’ve done. I’m still attached to the experiment at CERN, but it’s just one of those things! In many ways it’s a regret because I would love to be there full time at the moment because it is so genuinely exciting. We’re making serious progress and we’re going to discover something like the Higgs particle, I would guess, within the next 12 months.
But then again, you can’t do everything and it’s a common regret amongst academics, actually, that as they get older, they get taken away from the cutting edge of research if they’re not careful! But I suppose it is not a bad way to be taken away from the cutting edge, to make TV programs and push this agenda that I have to make science more relevant and popular.
UT : Absolutely! Outreach and educating the public is very important, especially in the area of research you are in. I would guess a majority of the general public are not exceptionally well-versed in particle physics.
Cox: Well, Carl Sagan is a great hero of mine and he used to say it is really about teaching people the scientific method – or actually providing the understanding and appreciation of what science is. We look at these questions, such as what happened just after the Universe began, or why the particles in the Universe have mass – they are very esoteric questions.
But the fact that we’ve been able to build some reasonable theories about how old the universe is — and we have a number, 13.73 ± 0.12 billion years, quite a precise figure — means that showing how you reach such remarkable conclusions is very important. And when you look at what we might call more socially important subjects – how to respond to global warming, what our policy should be for vaccinating the population against disease, how we should produce energy in the future – if you understand what the scientific method is, that it is apolitical and a-religious and a-everything, that there is no agenda there, that it is just a pure way of looking at the universe – that’s the important thing for society to understand.
“Wonders of the Universe” is a book about the television series. Traditionally these books are quite ‘coffee table,’ image-heavy books. The filming of the series took longer than we anticipated, so actually the book got written relatively quickly because I had time to sit down and really just write about the physics. Although it is tied with the television series, it does go quite a lot deeper in many areas. I’m quite pleased about that. So it’s more than just snapshots of my view of the physics of the TV series.
I should say also, some parts of it are in the form of a diary of what it was like filming the TV series. There are always some things you do and places you go that have quite an impact on you. And I tend to take a lot of pictures so many of the photographs in the book are mine. So, it is written on two levels: It is a much deeper view of the physics of the television series, but secondly it is a diary of the experience of filming the series and going to those places.
(Editor’s note, Cox is also just finishing a book on quantum mechanics, so look for that in the near future)
UT : What were some of your best experiences while filming ‘Wonders?’
Cox: One thing that, well, I wouldn’t say enjoyed filming, because it was quite nerve-wracking – but something that really worked was the prison demolition sequence in Rio. We used it as an analog for a collapsing star, a star at the end of its life that has run out of fuel and it collapses under its own gravity. It does that in a matter of seconds, on the same timescale as a building collapses when you detonate it.
Wandering around a building that is full of live dynamite and explosives is not very relaxing! It was all wired up and ready to go. But when we blew it up, and I thought it really worked well, and I enjoyed it a lot, actually as a television piece.
The ambition of the series is to try and get away from using too many graphics, if possible. You obviously have to use some graphics because we are talking about quite esoteric concepts, but we tried to put these things ‘on Earth’, by using real physical things to talk about the processes. What we did, we went inwards into the prison and at each layer we said, here’s where the hydrogen fuses to helium, and here’s the shell where helium goes to carbon and oxygen, and another shell all the way down to iron at the center of the stars. That’s the way stars are built, so we used this layered prison to illustrate that and then collapse it. That’s a good example of what the ambition of the series was.
UT : You’ve been called a rock star in the physics and astronomy field but in actuality you did play in a rock band before returning to science. What prompted that shift in your career?
Cox: I always wanted to be a physicist or astronomer from as far back as I can remember; that was always my thing when I was growing up. I got distracted when I was in my teens – or interested, I should say – in music and being in a band. The opportunity came to join a band that was formed by an ex-member of Thin Lizzy, a big rock band in the UK, and the States as well, so I did that. We made two albums; we toured with lots of people. That band split up and I went to university and then joined another band as a sideline, and that band got successful as well. That was two accidents, really! It was a temporary detour rather than a switch, because I always wanted to do physics.
UT : Thanks for taking the time to talk with us on Universe Today – we appreciate all the work you do in making science more accessible so everyone can better appreciate and understand how it impacts our lives.