Things academics say

Here is a short compilation of quotes from cosmologists and astrophysicists giving talks in our Department or at conferences. Some are brutal.

“All these detections of cores are, to put it mildly, not robust”                           Carlos Frenk

“Does anybody else here work on cores? … If so, stop working on that!”       Carlos Frenk

“To be fair, if we’d spent all that money on THIS question, we could tell you.” – “Well you should! Is there anything more important?”                                                           Carlos Frenk and Matt Auger

During a talk, George facepalms when a plot from paper X is shown. At the end he puts his hand up. “At the start, you mentioned a… tension. What do you think the situation is?” – “… well … there is a tension?” – ”(interrupting) That paper … (pauses to think) … gets my vote for the worst paper written this year so far.” – long, awkward silence. Students snicker. – speaker: “… do you want to elaborate?” – “… yes, I CAN elaborate.” Proceeds to wreck paper X.

Comment at end of talk: “Well, mostly it’s the color scheme chosen to confuse the audience”

Raises hand: “I don’t believe any of it.”

“But I discussed high velocity winds! I mentioned them just so I could… squash the claim!”                                      Ann Zabludoff

“Never don’t use a model. If a model is wrong, the data will remove it for you.”         David Hogg

Question from audience: “Is that supposed to be the message? That this really doesn’t work?” … it was not.

“To be able to convince yourself, and to convince other people, that you’re physically motivated”                                                                     Matthew Smith


Large-scale fluctuations in the cosmic ionising background: the impact of beamed source emission

In 2015, a paper published by George Becker caused a small revolution in the field of Reionisation. Reionisation is the process that made the Universe transparent to starlight, by destroying the neutral hydrogen left over literally all over the place by the Big Bang. That paper showed that the ionisation of the Universe is a lot less homogeneous than previously thought, with large patches of the Universe still quite neutral while others are already transparent. Because (proto-)galaxies are very common and very homogeneously distributed across the Universe at that time, it seems they cannot be the main actors in destroying the neutral hydrogen; otherwise the process would proceed far more smoothly. Instead, it was suggested that quasars, which are extremely rare, extremely powerful accreting supermassive black holes, are responsible. In this interpretation, the large variations in neutrality are due to the rarity of the sources capable of ionising the gas. (Although a few research groups have claimed that galaxies can still make it happen on their own; this is hotly debated, but the ‘rare sources’ hypothesis seems more popular in Cambridge, UK.)

This hypothesis has been tested in simulations of Reionisation, and it does appear to work quite well (although two competing models do too). But simulations at the moment don’t include enough physical effects to make the argument watertight. The black hole itself is not simulated; ‘source particles’ are used instead, and the smallest units of gas are the size of 10,000 stars or more. This is necessary because otherwise the simulations simply couldn’t be run, and the approximations are not as bad as they sound. Still, it is always good to add more physics.

In this paper, a team from UCL tests one of those previously ignored side-effects of the ‘quasars did it’ scenario: beaming. Previous simulations all assumed that ionising photons escape from quasars in all directions, but in reality they travel down jet-like funnels with an opening of roughly 30 degrees. This will (should!) change the power spectrum of ionisation considerably. The power spectrum tells you on which scales things are inhomogeneous/correlated; in other words, it picks out the typical spacing between ‘things’.

The paper is quite mathsy, so let’s skip straight to the results, summarised in the schematic diagram above. The only important lines are the red one and the thick black one.

The important point is that if the quasar funnels are very narrow (thick black line), the only quasars we can see are those pointing directly at us, so the true population must be much larger than the one we detect. To match the total number of ionising photons needed for Reionisation (which we know), each individual quasar then has to be weaker. The red line shows what happens under the (incorrect) assumption that there is no beaming.

In region I, we are looking at small scales. The small-scale power is boosted a lot by the variation around each quasar (the beams are small), but because the quasars are individually weaker, the effect largely cancels (the contrast between being close to and far from a quasar is reduced). The two models therefore roughly agree in region I, differing by only 5–15%.

In region II, the very largest scales, the distribution is dominated by the density field, because that’s where the ‘stuff’ (everything!) is, including the quasars. If the quasars are weaker and more numerous, their effect is more smeared out and the variation on the largest scales decreases overall.

In region III (intermediate scales), something odd happens: neutral gas also follows the density field, so at some scale the distribution becomes completely featureless. This effect is stronger the narrower the beams, because it happens in more places.

Unfortunately, the only scales we can observe with currently existing instruments are the smallest ones, where the effect of including beams is weakest. This is bad luck, because it most likely means we won’t be able to settle the question one way or the other.

Paper: ads

Spectroscopic confirmation of an ultra-faint galaxy at the epoch of reionization

This paper reports the discovery of the faintest-yet Lyman-alpha emitter during the Epoch of Reionisation. It is found at redshift 7.64, just 680 million years after the Big Bang.
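The quoted age can be sanity-checked with the standard flat ΛCDM age integral, t(z) = ∫ dz′ / [(1+z′) H(z′)]. A minimal sketch with assumed Planck-like parameters (not taken from the paper), radiation neglected:

```python
import math

H0 = 67.7            # Hubble constant, km/s/Mpc (assumed value)
Om, Ol = 0.31, 0.69  # matter and dark-energy densities (assumed values)
H0_per_gyr = H0 / 978.0  # 1 km/s/Mpc ≈ 1/978 Gyr^-1

def age_gyr(z, zmax=1e4, steps=50000):
    """Age of the Universe at redshift z, by trapezoidal integration
    of d ln(1+z') / E(z') from z out to zmax (radiation ignored)."""
    lo, hi = math.log(1 + z), math.log(1 + zmax)
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        one_plus_z = math.exp(lo + i * h)
        E = math.sqrt(Om * one_plus_z**3 + Ol)  # H(z')/H0
        w = 0.5 if i in (0, steps) else 1.0
        total += w / E
    return total * h / H0_per_gyr

print(f"{age_gyr(7.64) * 1000:.0f} Myr")  # close to the quoted 680 Myr
```

The exact number shifts by a few Myr with the choice of cosmological parameters, but z = 7.64 does indeed land at roughly 680 million years after the Big Bang.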

It has been a bit of a puzzle that the search for Lyman-alpha emitters (i.e., groups of stars) has been quite successful for bright objects, but very few faint objects have been found, even though our models predict that they exist and our instruments are sensitive enough. A theory dubbed ‘inside-out Reionisation’ was developed to explain this: basically, the ‘ionised bubbles’ around the smallest, faintest objects have smaller diameters. By the time light from such a galaxy reaches the ‘edge’ of its local ionised bubble, it has redshifted due to cosmic expansion, but not enough (it is still resonantly absorbed by the neutral hydrogen). This would mean there is a critical ‘ionising power’ below which objects cannot be found as Lyman-alpha emitters.

Fortunately, finding just one of these ultra-faint objects doesn’t contradict the theory. The area over which this galaxy was detected is extremely small, smaller than the expected typical size of an ionised bubble. There could easily be a larger, more powerful galaxy nearby which has done the heavy lifting of carving a big enough bubble for this ultra-faint blob to be visible.
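To get a feel for the bubble-size argument: the redshift a Lyman-alpha photon accumulates while crossing a bubble is set by the local Hubble flow, Δv ≈ H(z) × R. A rough sketch with assumed Planck-like parameters; the bubble radii and the idea that a few hundred km/s is needed to escape the resonance are illustrative literature-level numbers, not from this paper:

```python
import math

# Assumed cosmology (Planck-like); not taken from the paper.
H0, Om, Ol = 67.7, 0.31, 0.69
z = 7.64

# Hubble rate at redshift z, in km/s per proper Mpc (radiation ignored).
Hz = H0 * math.sqrt(Om * (1 + z)**3 + Ol)

# Velocity offset gained crossing a bubble of proper radius R: dv = H(z) * R.
for R in (0.1, 0.5, 1.0):  # illustrative proper radii in Mpc
    print(f"R = {R} Mpc  ->  dv = {Hz * R:.0f} km/s")
```

At z ≈ 7.6 the Hubble rate is close to 1000 km/s per proper Mpc, so a sub-Mpc bubble only buys a photon a few tens to a few hundred km/s of redshift, which is why small bubbles around faint galaxies may not be enough to let Lyman-alpha through.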

There are also signs that the radiation from this galaxy blob is very high energy, as suggested by excess light in the red optical which could be due to OIII. As the previous paper pointed out in a local setting, galaxy blobs can have strong ionising fields (Ly-C) without displaying very strong Lyman-alpha.

This system was detected via lensing by a large foreground cluster of galaxies. The authors’ previous paper laid out the candidate selection, narrowing it down to 9 objects, of which 2 were observed; one of those turned out not to be a faint primordial galaxy.

Link to article: arxiv

Haro 11: Where is the Lyman continuum source?

This paper contributes to solving a long-standing mystery related to the origin of Lyman-alpha, Lyman-continuum, and strong OIII radiation in small, young objects.

Small, young objects were all that existed at the time of Reionisation, so they must be responsible for the process in some way. The photons capable of ionising hydrogen come from two sources: Lyman-continuum radiation, or individual strong X-ray sources (accreting black holes or extreme stellar binaries). Currently, the total from both of these sources appears to be insufficient.

Lyman-continuum is produced whenever stars are present (especially, it seems, young stars), but it struggles to escape from the galaxy. Models of Reionisation only require ~20% of the radiation we know is being produced to escape (the ‘escape fraction’), but observations of local galaxies in ideal conditions have consistently reported much, much smaller values (Ly-C is extremely rarely seen). Another type of radiation, Lyman-alpha, is (believed to be) produced in nearly identical contexts as Ly-C, and is extremely common. This has led to the hypothesis that the escape of Ly-C is not so much rare as extremely inhomogeneous: all of it escapes through a ‘gap’ in the galaxy. We fail to detect it because the gap is tiny and facing away from Earth, but we know the Ly-C must be there because there is so much Lyman-alpha (and Lyman-alpha emission is not nearly as inhomogeneous). Problem: early galaxies don’t even show very much Lyman-alpha.

Various other mechanisms have been proposed. For instance, AGN radiation could be much more efficient at escaping galaxies; unfortunately, AGN are extremely rare in the early Universe (or maybe they don’t look like what we expect, and we haven’t found them yet). Alternatively, AGN beams could carve the holes through which Ly-C then escapes.

This paper provides a ‘field test’ of some of these radiation processes in a local, well-understood setting. It looks at the system Haro 11, the most easily studied Ly-C leaker. Haro 11 formed very recently (<50 Myr ago) and is full of young, low-metallicity (-ish) giant stars. It is also divided quite neatly into 3 separate blobs, all with different properties:

  • Blob C is the oldest (the WR stars have started disappearing), is a strong Lyman-alpha emitter, AND shows signs of containing an AGN.
  • Blob B shows signs of containing a strong AGN, emits X-rays, is young and shows only weak Lyman-alpha.
  • Blob A is young but shows (very) weak Lyman-alpha.

Question: which of these 3 blobs is Lyman-C actually ‘leaking’ from?

In a surprise twist, it turns out that Blob A is the most likely source by far. Although it is currently impossible to detect Ly-C with sufficiently good spatial resolution, the authors are able to measure which of the blobs is most ‘transparent to us’ using ratios of OIII and OII emission.

If this is true, it flies in the face of most of the theories mentioned above. It has often been assumed that Ly-C sources are a special type of Lyman-alpha source, but here it is the weakest Lyman-alpha emitter which is most transparent. And the X-ray sources present in Blobs B and C have not ‘carved escape channels’ as expected.

Link to article: arxiv

The Statistical Properties of Neutral Gas at z<1.65 from UV Measurements of Damped Lyman Alpha Systems

This paper discusses the occurrence of Mg II absorption systems and their relation to DLAs at z < 1.7 (from ~4 Gyr after the Big Bang to the present). At late times, after Reionisation has ended, DLAs contain the majority of the neutral hydrogen in the Universe. They are detected as broad features in the Lyman-alpha forest of quasars, meaning they cannot be counted beyond redshifts of ~5 (the first ~1 Gyr), where the Gunn-Peterson trough saturates the Lyman-alpha forest. However, the number of DLAs is found to decrease dramatically from redshift 5 to 1. This is usually interpreted as the leftover pockets of neutral hydrogen being destroyed by UV radiation long after Reionisation itself has finished (see Figure 3 of this paper).

DLAs often occur together with a Mg II doublet. In fact, this paper uses Mg II to detect DLAs, and the authors address the fact that they would miss ultra-low-metallicity DLAs. A DLA would have an ultra-low metallicity (at these late times!) only if it contained no stars capable of providing magnesium, as such stars would also destroy the DLA. This would require the DLA cloud to be self-shielded but also to have a mass below the Jeans mass, which is a very small range of masses. So the presence of ultra-low-metallicity DLAs is only a problem at early times, when the structure just hasn’t had enough time to collapse. Later, the contribution of such objects to the total neutral hydrogen budget is negligible.

A very interesting measurement (for me) is shown in Figure 2. It shows that strong Mg II systems are good indicators of a DLA, but weak systems are not, and in the limit of very weak Mg II, virtually none are part of a DLA. This makes it clear that there are two populations of Mg II systems, with different physical origins and statistical distributions.

If this is causally linked with the increase in DLAs at high redshift, it could be that weak Mg II systems are also part of DLAs at z > 2. This would mean that systems with lower metallicity lose their DLAs (become ionised) first, presumably because:

  • They have lower densities, so fewer stars, so they form less Mg II, and
  • They have lower densities, so they are more easily destroyed by the UVB and disappear first.

This is consistent with the generally accepted view that late-time DLAs subsist due to higher-than-average densities.

Not part of this paper: where do all the ultra-weak (W < 0.3 Å) Mg II systems come from, which never have proper DLAs associated with them? Maybe they are just telling us something about the low-metallicity IMF.

Link to article: arxiv

sources: Zwaan05, Nestor16, Noterdaeme12 for z>2


Charged Thin-shell Gravastars in Noncommutative Geometry and Cosmic Censorship

This paper considers an alternative to black holes, gravastars, in which different types of spacetimes are glued together with appropriate junction conditions. One nice feature is that gravastars do not need to involve event horizons.

The topic is mathematically rather opaque and not within my expertise by any stretch. It deserves a mention for the spelling mistake ‘cut and past technique’, which appears twice in the paper.

Best accidental pun in an academic paper award!

Link to article: ads