According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. This being Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too. And of course, the first rule of Journal Club is… don’t talk about Journal Club.
So, without further ado – today’s journal article on the dissection table is about using our limited understanding of dark matter to attempt to visualise the cosmic web of the very early universe.
Today’s article:
Visbal et al., The Grand Cosmic Web of the First Stars.
So… dark matter, pretty strange stuff huh? You can’t see it – which presumably means it’s transparent. Indeed it seems to be incapable of absorbing or otherwise interacting with light of any wavelength. So dark matter’s presence in the early universe should make it readily distinguishable from conventional matter – which does interact with light and so would have been heated, ionised and pushed around by the radiation pressure of the first stars.
This fundamental difference may lead to a way to visualise the early universe. To recap those early years: first there was the Big Bang, then three minutes later the first hydrogen nuclei formed, then 380,000 years later the first stable atoms formed. What follows from there is the so-called dark ages – until the first stars began to form from the clumping of cooled hydrogen. And according to the current standard model of Lambda Cold Dark Matter, this clumping primarily took place within gravity wells created by cold (i.e. slow-moving) dark matter.
This period is what is known as the reionization era, since the radiation of these first stars reheated the intergalactic hydrogen medium and hence re-ionized it (back into a collection of H+ ions and unbound electrons).
While this is all well-established cosmological lore, it is also the case that the radiation of the first stars would have applied a substantial radiation pressure to that early, dense intergalactic medium.
So the early intergalactic medium would not only be expanding with the expansion of the universe, it would also be pushed outwards by the radiation of the first stars – meaning there should be a relative velocity difference between the intergalactic medium and the dark matter of the early universe, since the dark matter would be immune to any radiation pressure effects.
To visualise this relative velocity difference, we can look for neutral hydrogen emission at a rest wavelength of 21 cm – further red-shifted by cosmic expansion, but in any case well into the radio spectrum. Radio astronomy observations at these wavelengths offer a window onto the distribution of the very first stars and galaxies, since these are the source of the first ionising radiation that differentiates the dark matter scaffolding (i.e. the gravity wells that support star and galaxy formation) from the remaining reionized intergalactic medium. And so you get the first signs of the cosmic web when the universe was only 200 million years old.
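As a rough illustration (my own back-of-envelope numbers, not taken from the paper), here is where the redshifted 21 cm line would land in the radio spectrum for a signal emitted when the universe was a couple of hundred million years old, taking z ~ 20 as an assumed ballpark redshift for that epoch:

rest_freq_mhz = 1420.4       # rest-frame frequency of the neutral hydrogen line
rest_wavelength_cm = 21.1    # rest-frame wavelength

def observed_21cm(z):
    """Return (observed frequency in MHz, observed wavelength in m) at redshift z."""
    freq = rest_freq_mhz / (1.0 + z)
    wavelength_m = rest_wavelength_cm * (1.0 + z) / 100.0
    return freq, wavelength_m

for z in (15, 20, 25):
    f, lam = observed_21cm(z)
    print(f"z = {z}: {f:.0f} MHz, {lam:.1f} m")
# z ~ 20 puts the signal near 70 MHz – low-frequency radio, LOFAR/SKA territory.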
Higher resolution views of this early cosmic web of primeval stars, galaxies and galactic clusters are becoming visible through high resolution radio astronomy instruments such as LOFAR – and hopefully one day in the not-too-distant future, the Square Kilometre Array – which will enable visualisation of the early universe in unprecedented detail.
So – comments? Does this fascinating observation of 21 cm absorption lines somehow lack the punch of a pretty Hubble Space Telescope image? Is radio astronomy just not sexy? Want to suggest an article for the next edition of Journal Club?
Interesting reading further about the SKA radio telescope project. Anyone know, offhand, why the apparently random placement of the Gregorian dishes within the confines of the central core of the array?
That was a _very_ good question. By the way, the Wikipedia link mentions that the placement of all antennas in the central core and mid region is purposely random, while logarithmic spiral placement is used further out.
I had to google up this analysis. I gather the situation is this:
– You don’t pay for a humongous antenna area.
– The dispersal of antenna area is called thinning. Thinning introduces side lobes into the beam pattern of the array, i.e. it places a filter on the received signal.
– When you point towards lower elevations (larger zenith angles), so that antennas start to shadow each other, a regular array gives a slightly better signal-to-noise ratio in the general case.
– However, regular placement introduces spurious correlations between signals. A lot of energy goes into the side lobes of a heavily thinned array; randomization (I think) minimizes and spreads that effect – see the toy sketch below.
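To put a number on that side-lobe argument, here is a toy one-dimensional sketch (my own, not from any SKA design document, with arbitrary choices of aperture, wavelength and antenna count): a sparse regular array piles the leftover energy into strong grating lobes, while random placement smears it into a low pedestal of side lobes.

import numpy as np

rng = np.random.default_rng(1)
n_ant = 64                 # number of antennas (arbitrary)
aperture = 1000.0          # metres (arbitrary)
wavelength = 2.0           # metres, roughly 150 MHz (arbitrary)
k = 2 * np.pi / wavelength

regular = np.linspace(0.0, aperture, n_ant)               # evenly spaced positions
random_pos = np.sort(rng.uniform(0.0, aperture, n_ant))   # random positions

theta = np.linspace(-0.2, 0.2, 8001)                      # angle from the pointing centre, radians

def worst_sidelobe_db(positions):
    # Far-field power pattern of isotropic elements, normalised to the main beam.
    pattern = np.abs(np.exp(1j * k * np.outer(np.sin(theta), positions)).sum(axis=1)) ** 2
    pattern /= pattern.max()
    half = len(pattern) // 2
    # Exclude the main beam (central ~0.005 rad) and report the strongest remaining lobe.
    sidelobes = np.concatenate([pattern[:half - 100], pattern[half + 101:]])
    return 10.0 * np.log10(sidelobes.max())

print(f"regular grid : worst side lobe {worst_sidelobe_db(regular):6.1f} dB")
print(f"random layout: worst side lobe {worst_sidelobe_db(random_pos):6.1f} dB")

With these toy numbers the regular grid’s worst “side lobe” is really a grating lobe, nearly as strong as the main beam, while the random layout’s worst lobe sits well below it.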
By the way, when I did that I stumbled onto the fact that the SKA physically extended processor network, SkyNet, goes into action, not 2012, but 2018. We are then DISHED! Or domed, whatever.
Thanks Torb~
SkyNet eh? Frightening! But at least it will give the doom (or dome) sayers something to go on when nothing happens in 2012! “Oh, just wait for 2018!”
I am expecting someone will come up with a wacky notion for how the combined SKA data flows (80 gigabytes per second per antenna) will generate an information vortex sufficient to consume the Earth.
Or perhaps some equally wacky magnetic resonance scenario – let’s face it, putting a lot of electrified metal out in the middle of a desert is just asking for trouble 🙂
Such stories will probably appear around the time the LHC (another big science machine) has done everything it could possibly do to produce an Earth-consuming black hole – without actually producing one.
As far as I understand it, radio astronomy was very sexy once. Some of what the SKA offers (cosmology, early universe, astrobiology) is certainly sexy today!
I gave this paper a first read-through. It models the relative properties of dark matter and luminous matter under radiation pressure from the first stars. I might conjecture that this predicts a preponderance of hydrogen over dark matter in intergalactic space. Of course I am not sure how one would detect dark matter in intergalactic space, particularly if it is evenly and thinly dispersed.
It is interesting to see they are simulating the universe at z ~ 65. This exceeds the linear v = Hd Hubble rule. Scale factors evolve as (a’/a)^2 = 8πGρ/3 = Λ/3, which has a solution
a(t) = a_0 exp(sqrt{Λ/3} t).
For v = 65c (that is a whopping 19.5 million km per second) the distance is
d ~ ln(v/c) sqrt{3/Λ} ~ 4.17×10^{10} ly
or about 42 billion light years away. The CMB limit has v/c ~ 1000, putting it at 69 billion light years out.
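(Just to make that arithmetic reproducible: a minimal snippet that takes the comment’s de Sitter toy model at face value, assuming sqrt(3/Λ) is 10^10 light years and identifying v/c with the quoted factors of 65 and 1000 – the commenter’s shorthand, not a derivation.)

import math

hubble_length_ly = 1.0e10        # assumed sqrt(3/Lambda), in light years

def distance_ly(v_over_c):
    # d ~ ln(v/c) * sqrt(3/Lambda), as in the comment above
    return math.log(v_over_c) * hubble_length_ly

print(f"v/c = 65   -> d ~ {distance_ly(65):.2e} ly (about 42 billion light years)")
print(f"v/c = 1000 -> d ~ {distance_ly(1000):.2e} ly (about 69 billion light years)")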
Of course this is one of those things which stun people: how can something move faster than light? Further, the cosmological horizon, similar to a black hole event horizon, is about 10^{10} light years away, and yet we see things beyond this distance. Any galaxy observed with a redshift factor z > 1 is moving faster than light. These huge faster-than-light velocities are due to the expansionary dynamics of space, not to the relative motion of bodies in static, flat space. The cosmological horizon is similar to a black hole horizon, but from the perspective of an observer looking out, not from the perspective of an exterior observer looking at a black hole. So we can observe a very distant galaxy beyond the horizon, but we can never send a signal to it. This is similar to the situation an observer in a black hole faces, where that observer can’t ever send a signal to the exterior.
LC
Hi LC.
May I ask what is your opinion of Dragan Hajdukovic’s Paper submission of June 4th last in which dark matter is explained as gravitational polarization of the quantum vacuum by baryonic matter?
http://arxiv.org/pdf/1106.0847v2
If I may: it is a rewritten MOND alternative.
MOND can’t predict any dark matter effects outside of galaxies, from the CMB to, famously, galaxy clusters and their collisions. This is due to its two main problems: it doesn’t admit the observed cold, particulate nature of DM, and it erroneously ties its effects to baryonic matter.
With the recent Eris simulation, MOND is now far inferior to DM even in its former last holdout. Eris shows that DM is needed to predict spiral galaxies correctly, and in the process many specific predictions are made.
MOND is finally dead.
Again many thanks Torbjorn. Hajdukovic’s viewpoint generated public interest when it was presented in “National Geographic” on August 31st last, leading one wag to comment to this classic effect:
“I know my cat weighs 10 pounds, but when I weigh my cat it weighs 20 pounds. I know my cat weighs 10 pounds, so the measurement I am getting must be wrong… there must be an additional invisible cat clinging to the back of my 10 pound cat.”
My initial response to this is skeptical. There is a gravitational polarization of the vacuum that occurs with gravity fields. The polarization is a measure of the Weyl curvature. Since gravity is very weak it requires considerable curvature to generate such polarizations. This can happen interior to black holes and such. It is problematic whether this can occur with ordinary baryonic matter at standard densities.
LC
Excellent analysis LC thanks, always enlightening to have your mathematical prowess employed.
Very sad news about the death of Steve Rawlings, who was a significant figure in the SKA project. The sudden and >c expansion of space (into what?) is one of the ideas that does not sit comfortably with a lot of people, but the comparison of the cosmological horizon with the event horizon of a black hole is an interesting thought. An observer inside a black hole (physical effects notwithstanding) would be seeing the exact opposite of what we see: a universe contracting at an accelerating rate towards a singularity.
You say ‘ Further, the cosmological horizon, similar to a black hole event horizon, is about 10^{10}light years away and yet we see things beyond this distance’.
I’m not sure this is accurate (though maybe my terminology is confused). Quoting from: http://en.wikipedia.org/wiki/Observable_universe
‘The particle horizon differs from the cosmic event horizon in that the particle horizon represents the largest comoving distance from which light could have reached the observer by a specific time (i.e. the 92 billion light year observable universe), while the (cosmic) event horizon is the largest comoving distance from which light emitted now can ever reach the observer in the future. At present, this cosmic event horizon is thought to be at a comoving distance of about 16 billion light years.’
Your statement that ‘yet we still see things beyond this distance’ is only true to the extent that we see things as they looked up to (say) 13 billion years ago. It is quite likely that the things we ‘see’ at this distance no longer exist – due to galactic mergers, supernova explosions, black hole accretion, etc. I’m not sure that this complex scenario is analogous to a black hole event horizon.
This is correct and reiterates what I indicated. I did not discuss in detail the particle horizon
d_h = ∫ dt/a(t)
in the FLRW energy equation
(a’/a)^2 = 8πGρ/3 – k/a^2
where a’ = da/dt. This integral is the conformal time, which gives the particle horizon. The conformal time is the time since the big bang, expanded by the conformal flow of space in its evolution. Over the limits of this integral [t_0, t] (t_0 the fiducial start time and t the current time), this conformal flow defines the comoving distance of the universe, which in an expanding universe is larger than the distance computed from the time t multiplied by the speed of light.
As a result the distance limit for observing particles in the distant past can be far larger than the naïve distance computed from the age of the universe. Inflation in the past implies a much larger vacuum density ρ, which expands the past horizon enormously. This implies that if we develop the techniques for neutrino astronomy or gravity wave interferometer telescopes, we could in principle observe the universe far further into the past than the 46 bly radius computed from the “coasting” FLRW state of the universe after the end of the radiation dominated period. The CMB is an opaque boundary to optical observation and demarcates this radius.
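As a sanity check on that ~46 billion light year figure, here is a minimal numerical sketch (mine, assuming a flat ΛCDM universe with roughly Planck-like parameters, not anything taken from the thread) of the comoving particle horizon d_p = c ∫ dz / H(z):

import numpy as np

H0 = 67.7                        # km/s/Mpc (assumed)
omega_m, omega_r, omega_L = 0.31, 9.0e-5, 0.69   # assumed flat-LCDM parameters
c = 299792.458                   # km/s

def hubble(z):
    return H0 * np.sqrt(omega_r * (1 + z)**4 + omega_m * (1 + z)**3 + omega_L)

# Comoving particle horizon: c times the integral of dz / H(z) from 0 to "infinity"
# (the integrand is negligible by z ~ 10^7, so a large cutoff is fine).
z = np.logspace(-4, 8, 200_000)
f = 1.0 / hubble(z)
d_mpc = c * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))   # trapezoid rule, in Mpc

print(f"comoving particle horizon ~ {d_mpc * 3.2616e-3:.0f} Gly")   # roughly 46 Gly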
The cosmological event horizon marks the boundary beyond which we can’t send a signal. Any galaxy with z > 1 is such that we may never now be able to send a signal to that galaxy which observers there in the future could detect. This is a bit like trying to send a signal to the exterior world from the interior of a black hole. That galaxy on the current Hubble frame, a spatial surface of clock simultaneity for comoving galaxies, could have observers who do observe our galaxy in its distant past, just as we observe that galaxy now.
It is the case that we observe galaxies up to about 13 billion years into the past. On the Hubble frame, a spatial slice of comoving galaxies, these distant objects have subsequently evolved into more quiescent objects, like our Milky Way galaxy. Even the rather active M81 (or is it M82?) galaxy is a mild object compared to the violent quasars of billions of years ago or the titanic explosions of Pop III stars seeding nascent galaxies and so forth.
LC
“Scale factors evolve as (a’/a)^2 = 8πGρ/3 = Λ/3, which has a solution
a(t) = a_0 exp(sqrt{Λ/3} t).”
I’m just an amateur so apologies if I’m being dense but isn’t it:
(a’/a)^2 = (8πGρ + Λ)/3
http://en.wikipedia.org/wiki/Friedmann_equations#The_equations
I may be getting confused over the symbols though.
I understood the exponential solution was applicable when dark energy dominates, but at that era (z ~ 65) the universe was matter-dominated, so the solution then would be:
a(t) = a_0 t^(2/3)
with a gradual switch to dark energy domination around 8 billion years ago.
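A quick back-of-envelope sketch of why that matters (my own crude numbers, simply assuming a present age of ~13.8 billion years and pure matter-dominated t^(2/3) scaling back in time): at z ~ 65 the universe is only a few tens of millions of years old, deep in the matter era, so the exponential solution doesn’t yet apply.

t_now_gyr = 13.8   # assumed present age of the universe, Gyr (crude)

def age_at_redshift_matter(z, t_now=t_now_gyr):
    # a ~ t^(2/3)  =>  1/(1+z) = (t/t_now)^(2/3)  =>  t = t_now * (1+z)^(-3/2)
    return t_now * (1.0 + z) ** -1.5

for z in (0, 1, 6, 20, 65, 1100):
    print(f"z = {z:5d}: age ~ {age_at_redshift_matter(z) * 1000:9.1f} Myr")
# z ~ 65 comes out around 25 Myr with this crude scaling – long before
# dark energy could possibly dominate.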
At times when the vacuum energy density dominates the cosmological constant can be computed from that energy density
Λ = 8πGρ
for ρ the quantum expectation of the vacuum energy. During the inflationary period of the universe ρ ~ 10^{72} GeV^4. The current state of the universe is also inflationary, but much milder, with ρ ~ 10^{-48} GeV^4. The end of the early inflationary period was one of active particle generation, which led to the radiation and then matter dominated periods. During these periods radiation and matter had larger energy density than the vacuum. During these epochs, now past, the universe evolved by these fractional polynomial rules.
LC
Whatever we discuss here just makes my head hurt. 😀
fix
I think there is a bug in the system here. This happened to me as well. You try to post something and there is an error message box which appears. You then try again and the same error happens. The problem is that the message is getting posted and every time you try to post to get around the error multiple copies of it show up.
LC
Thanks – sorry. Now just repeated post saying ‘fix’. Agree there’s a bug.
There may be other purposes and reasons for dark matter to exist. It may have had a completely different role than the one elucidated in this article. Here is one of many possible purposes/reasons for the existence of dark matter…
My theory on the birth of the Universe – as an emergent phenomenon from self-organizing dark matter – http://ydessays.blogspot.com/2012/01/case-for-emergence-as-cause-of-birth-of.html