GREAT3 Mid-challenge Meeting

  • Contact

    The Royal Observatory, Edinburgh

  • Working language

    English


Scientific background

In our currently accepted cosmological model, the baryonic matter from which stars and planets are made accounts for only 4% of the energy density of the Universe. In order to explain a wide variety of cosmological observations, we have been forced to posit the existence of dark matter (which we detect through its gravitational attraction alone) and dark energy (which causes a repulsion that is driving the accelerated expansion of the Universe, the discovery of which led to the 2011 Nobel Prize in Physics). While we infer the existence of these dark components, the question of what they actually are remains a mystery.

Gravitational lensing is the deflection of light from distant objects by the matter along its path – all of that matter, including the dark matter. Lensing measurements are thus directly sensitive to dark matter. They also allow us to infer the properties of dark energy, because the accelerated expansion of the Universe that it causes directly opposes the effects of gravity, which tends to cause clumping of matter into larger and larger structures.

This measurement relies on the small but spatially coherent distortions (known as weak shears) in the shapes of distant galaxies, which provide a statistical map of the large-scale structures along the line of sight. Because of the sensitivity of weak lensing to gravity and to the dark components of the Universe, the astronomical community has designed a number of upcoming experiments that will measure it very precisely, and thereby constrain cosmological parameters. However, the increasing size of these experiments and the decreasing statistical errors come at a price: to take full advantage of their promise, we must understand systematic errors with increasing precision as well. The coherent lensing distortions of galaxy shapes are typically ~1% in magnitude, far smaller than galaxy intrinsic ellipticities, and, more problematically, smaller than the coherent shape distortions due to light propagation through the atmosphere and telescope optics (collectively termed the point spread function, or PSF).

Removing the effects of the PSF and measuring lensing shears for galaxies that are only moderately resolved and have limited signal-to-noise is a demanding applied statistical problem that has not been solved adequately for upcoming surveys. Systematic errors related to shape measurement must be reduced by factors of ~5-10 in the next decade. The goal of the GREAT3 challenge is to facilitate further work in understanding existing methods of PSF correction, to suggest ways that they can be developed and improved in the future, and to spur the creation of new methods that overcome the limitations of existing ones.
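To make the scale of the problem concrete, here is a toy sketch (not part of the GREAT3 materials, and far simpler than any real pipeline): galaxy ellipticity can be estimated from the second moments of the image, and a ~1% coherent shear only emerges after averaging over many galaxies, because each galaxy's intrinsic ellipticity (scatter ~0.2-0.3) dominates the signal. All function names and parameter values below are our own illustrative choices, using unweighted moments on noiseless Gaussian profiles; real methods must also handle pixel noise and PSF correction.

```python
import numpy as np

def render_gaussian(e1, sigma=3.0, n=64):
    """Render a noiseless elliptical Gaussian galaxy image (toy model)."""
    y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
    # Distort the coordinate grid so the isophotes have ellipticity ~e1.
    xs = (1 - e1 / 2) * x
    ys = (1 + e1 / 2) * y
    return np.exp(-(xs**2 + ys**2) / (2 * sigma**2))

def measure_ellipticity(img):
    """Estimate (e1, e2) from unweighted second moments of the image."""
    n = img.shape[0]
    y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
    f = img / img.sum()
    qxx = np.sum(f * x * x)
    qyy = np.sum(f * y * y)
    qxy = np.sum(f * x * y)
    return (qxx - qyy) / (qxx + qyy), 2 * qxy / (qxx + qyy)

# A single galaxy: the moments recover the input ellipticity.
e1, e2 = measure_ellipticity(render_gaussian(0.1))

# An ensemble: each galaxy has a random intrinsic ellipticity plus a
# small coherent shear; only the average reveals the shear.
rng = np.random.default_rng(0)
g1_true = 0.01
estimates = [
    measure_ellipticity(
        render_gaussian(np.clip(rng.normal(0.0, 0.2) + g1_true, -0.7, 0.7))
    )[0]
    for _ in range(500)
]
# The mean approaches g1_true, but with noise ~ 0.2 / sqrt(500) ~ 0.009,
# i.e. comparable to the signal itself -- hence the need for large surveys
# and tight control of systematics.
```

Note that even in this idealized noiseless setting, 500 galaxies barely suffice to detect a 1% shear; any PSF-induced shape bias that is coherent across galaxies does not average away, which is why PSF correction dominates the systematic error budget.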

What this challenge involves

The GREAT3 challenge will involve the release of simulated ground- and space-based data for blind tests of shear recovery, just like previous challenges. However, we will also release the GalSim software used to create the simulations (blinding only the input values for parameter distributions and lensing shears) as a development tool for those who want to improve their shape measurement methods.
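The core operation such simulations perform can be sketched in a few lines. The following is a toy analogue only, written in plain numpy; it is NOT the GalSim API, and the function name and parameter values are our own. It shows the forward process the challenge inverts: a sheared galaxy profile is convolved with a PSF before it reaches the pixels, diluting the shear signal that measurement methods must then recover.

```python
import numpy as np

def toy_simulation(g1, gal_sigma=3.0, psf_sigma=2.0, n=64):
    """Toy analogue of one simulated galaxy postage stamp:
    a weakly sheared Gaussian galaxy convolved with a Gaussian PSF."""
    y, x = np.mgrid[:n, :n] - n / 2.0
    # A weak shear g1 stretches one axis and squeezes the other.
    galaxy = np.exp(-(((1 - g1) * x) ** 2 + ((1 + g1) * y) ** 2)
                    / (2 * gal_sigma ** 2))
    psf = np.exp(-(x ** 2 + y ** 2) / (2 * psf_sigma ** 2))
    psf /= psf.sum()  # normalize so convolution preserves total flux
    # FFT-based convolution; fftshift re-centres the blurred image.
    observed = np.fft.fftshift(
        np.real(np.fft.ifft2(np.fft.fft2(galaxy) * np.fft.fft2(psf))))
    return galaxy, observed

galaxy, observed = toy_simulation(g1=0.05)
# The PSF lowers the peak and rounds the shape while preserving flux;
# participants see only `observed` (plus noise) and must infer g1.
```

In the actual challenge, GalSim plays the role of this sketch with far more realism (realistic galaxy morphologies, optical and atmospheric PSFs, pixelization, and noise), and only the input parameter distributions and lensing shears are blinded.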

What we will test with this challenge

This challenge will focus on three specific problems related to shear estimation, each tested in its own branch (in addition to a fourth branch that combines all three):

  • Realistic galaxy morphology: are shape measurement techniques sensitive to it? What is the effect of using real galaxy images as sources, rather than simple parametric forms?
  • Realistic PSF uncertainty: how sensitive is shape measurement to realistic levels of uncertainty in PSF models, compared to the case of a fully known PSF?
  • Multi-exposure imaging: upcoming surveys will take many short exposures rather than single long ones, some with different PSFs and some significantly under-sampled. How is shape measurement affected?

More info on this website!
