Sparsity and Deep Learning for Modern Surveys

Statistical Challenges in 21st Century Cosmology -- Valencia, Spain


François Lanusse

The Large Synoptic Survey Telescope

  • 18,000 square degrees, observed once every few days
  • 1000 images each night, 15 TB/night, for 10 years
  • Tens of billions of objects, each one observed $\sim1000$ times


$\Longrightarrow$ Unprecedented volume of data
Huang et al. (2017)


$\Longrightarrow$ Unprecedented complexity of data

The challenges of modern surveys


  • Handling the increased data rates

  • Accessing the information present in the data

  • Controlling systematic errors

Sparsity

$y = \mathbf{A} x + n$

$\mathbf{A}$ non-invertible or ill-conditioned
$\Longrightarrow$ ill-posed inverse problem with no unique solution $x$
Classical examples:
  • Deconvolution
  • Inpainting
  • Denoising
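
As a concrete illustration of such an ill-posed problem, here is a minimal numpy sketch (illustrative only, not code from the talk) of a 1-D deconvolution forward model, where the blur operator $\mathbf{A}$ is severely ill-conditioned:

```python
import numpy as np

# Minimal sketch: 1-D deconvolution forward model y = A x + n.
# The circulant Gaussian-blur operator A is severely ill-conditioned,
# so naive inversion amplifies the noise catastrophically.
n_pix = 128
t = np.arange(n_pix)

psf = np.exp(-0.5 * ((t - n_pix // 2) / 3.0) ** 2)   # Gaussian PSF
psf /= psf.sum()
# Row k of A is the PSF centred on pixel k (circular convolution)
A = np.stack([np.roll(psf, k - n_pix // 2) for k in range(n_pix)])

x = np.zeros(n_pix)
x[[30, 70, 90]] = [1.0, 0.5, 0.8]                    # sparse true signal
y = A @ x + 0.01 * np.random.randn(n_pix)            # noisy observation

print(np.linalg.cond(A))         # enormous condition number -> ill-posed
x_naive = np.linalg.solve(A, y)  # direct inversion: dominated by noise
```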

The sparse recovery framework

  • Sparsity as a powerful and generic signal prior
  • Fast algorithms for recovering a MAP solution

    $\mathrm{argmin}_{x} \quad \parallel y - \mathbf{A} x \parallel_2^2 \ + \ \lambda \parallel \Phi^* x \parallel_1$
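
As a minimal illustration (a sketch of the generic approach, not an actual survey pipeline), this problem can be solved by iterative soft-thresholding (ISTA); here $\Phi$ is taken as the identity, standing in for an orthogonal wavelet dictionary:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA for argmin_x 1/2 ||y - A x||_2^2 + lam ||x||_1.

    Sketch with Phi = Id; for an orthogonal dictionary Phi, replace the
    thresholding step by Phi soft_threshold(Phi^* x, tau * lam).
    """
    x = np.zeros(A.shape[1])
    tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step <= 1/L, L = ||A||_2^2
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)            # gradient of the data fidelity
        x = soft_threshold(x - tau * grad, tau * lam)
    return x
```

For instance, `x_map = ista(A, y, lam=0.05)` recovers the sparse signal from the deconvolution sketch above far better than the naive inverse.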

The GLIMPSE mass-mapping algorithm


Peel, Lanusse, Starck (2017)
$\mathrm{argmin}_\kappa \frac{1}{2} \parallel \mathcal{C}_\kappa^{-1} ( (1 - \kappa) g - \mathbf{T} \ \mathbf{Q} \ \mathbf{F} \kappa ) \parallel_2^2 + \lambda \parallel \Phi^* \kappa \parallel_1 + \ i_{\mathbb{R}}(\kappa)$
$\mathrm{argmin}_\delta \frac{1}{2} \parallel \mathcal{C}_\kappa^{-1} ( (1-\kappa) g - \mathbf{T} \ \mathbf{P} \ \mathbf{Q} \ \mathbf{F} \delta ) \parallel_2^2 + \lambda \parallel \Phi^* \delta \parallel_1 + \ i_{>0}(\delta)$

  • Includes reduced shear and individual redshift PDFs
  • Solved using a primal-dual proximal algorithm adapted from Vu (2013); a generic sketch follows below
Lanusse et al. (2016)
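
Schematically, for a problem of the form $\mathrm{argmin}_x \ f(x) + \lambda \parallel \Phi^* x \parallel_1 + i_C(x)$ with $f$ smooth, the Vu (2013) primal-dual splitting alternates (a generic sketch, not the exact GLIMPSE iterations):

$x_{k+1} = \mathrm{prox}_{\tau i_C}\left( x_k - \tau \nabla f(x_k) - \tau \Phi u_k \right)$
$u_{k+1} = \mathrm{prox}_{\sigma (\lambda \parallel \cdot \parallel_1)^*}\left( u_k + \sigma \Phi^* (2 x_{k+1} - x_k) \right)$

where the dual proximal operator reduces to a projection onto the $\ell_\infty$ ball of radius $\lambda$, and the step sizes $\tau, \sigma$ must satisfy a convergence condition involving the Lipschitz constant of $\nabla f$ and $\parallel \Phi \parallel$.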

Lanusse & Starck, in prep.

3D reconstruction of the COSMOS field

  • Transverse Wiener Filter (Simon et al. 2012)
  • GLIMPSE (Lanusse & Starck, in prep.)

A few comments



$\mathrm{argmin}_{x} \quad \parallel y - \mathbf{A} x \parallel_2^2 \ + \ \lambda \parallel \Phi^* x \parallel_1$

  • Relies on knowledge of $\mathbf{A}$

  • Requires expert knowledge to choose $\Phi$

Deep Learning

In the news lately...

  • Self-driving Uber takes the road in Pittsburgh (Sept. 2016)

  • CMU's Libratus beats top poker players (Jan. 2017)

  • Google's AlphaGo beats world's top Go player (May 2017)

A technological revolution brought about by advances in Deep Learning.

A concrete example

WWAD: What Would an Astrophysicist Do?

Figure panels: gri composite; g - $\alpha$ i difference; detected areas; HST images

RingFinder (Gavazzi et al. 2014)


Requires ~30 person-minutes/sq. deg. of visual inspection
Gavazzi et al. (2014), Collett (2015)


Plainly intractable at the scale of LSST

A conventional Convolutional Neural Network


Preactivated Residual Unit
(He et al. 2016)

CMUDeepLens Architecture
(Lanusse et al. 2017)
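
As an illustration, a preactivated residual unit (He et al. 2016) can be written in a few lines; this is a minimal PyTorch sketch, not the actual CMUDeepLens implementation:

```python
import torch.nn as nn
import torch.nn.functional as F

class PreActResidualUnit(nn.Module):
    """Preactivated residual unit: (BN -> ReLU -> Conv) twice, plus identity."""

    def __init__(self, channels):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)

    def forward(self, x):
        h = self.conv1(F.relu(self.bn1(x)))
        h = self.conv2(F.relu(self.bn2(h)))
        return x + h   # identity skip connection eases optimization of deep nets
```

Stacking such units lets gradients flow through the identity branch, which is what makes very deep architectures trainable.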

Some CMUDeepLens results



True Positive Rate = $\frac{TP}{TP + FN}$

  • $TP$: True Positives
  • $FN$: False Negatives


False Positive Rate = $\frac{FP}{FP + TN}$

  • $FP$: False Positives
  • $TN$: True Negatives
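
Both rates follow directly from thresholded classifier scores; sweeping the threshold traces out the ROC curve. A minimal numpy sketch (illustrative, with hypothetical argument names):

```python
import numpy as np

def tpr_fpr(y_true, y_score, threshold=0.5):
    """True and false positive rates at a given score threshold.

    y_true: array of 0/1 labels (1 = lens), y_score: classifier outputs.
    """
    y_pred = y_score >= threshold
    tp = np.sum(y_pred & (y_true == 1))    # true positives
    fn = np.sum(~y_pred & (y_true == 1))   # false negatives
    fp = np.sum(y_pred & (y_true == 0))    # false positives
    tn = np.sum(~y_pred & (y_true == 0))   # true negatives
    return tp / (tp + fn), fp / (fp + tn)
```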

The Euclid strong-lens finding challenge

Better accuracy than human visual inspection!

The promise of Deep Learning



  • Purely data-driven

  • Little expert knowledge necessary

  • No painstaking feature design

Deep Generative Models

The need for complex data models


Tenneti et al. (2015)

The need for data-driven generative models



  • Lack or inadequacy of physical models
  • Extremely computationally expensive simulations


Can we learn a model for the signal from the data?

The evolution of generative models

  • Deep Belief Network
    (Hinton et al. 2006)

  • Variational AutoEncoder
    (Kingma & Welling 2014)

  • Generative Adversarial Network
    (Goodfellow et al. 2014)

  • Wasserstein GAN
    (Arjovsky et al. 2017)
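
For orientation, the adversarial objectives behind the last two entries (standard formulations from the cited papers, not results from this talk): Goodfellow et al. (2014) train a generator $G$ against a discriminator $D$ via

$\min_G \max_D \ \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p(z)}[\log(1 - D(G(z)))]$

while the Wasserstein GAN replaces this with

$\min_G \max_{\parallel D \parallel_L \leq 1} \ \mathbb{E}_{x \sim p_{\mathrm{data}}}[D(x)] - \mathbb{E}_{z \sim p(z)}[D(G(z))]$

whose 1-Lipschitz critic estimates the Wasserstein-1 distance between the real and generated distributions, considerably stabilizing training.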

A visual Turing test


Fake PixelCNN samples

Real SDSS

Learning COSMOS galaxy morphologies

Ravanbakhsh, Lanusse et al. (2017)

Learning the galaxy-halo connection

Lanusse et al. (in prep.)
Adapted from the network behind the cosmic web (credit: Kim Albrecht)

Deep Learning on graphs


Kipf & Welling (2017)
Preliminary results from a graph Mixture Density Network model
Lanusse et al. (in prep.)
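
A minimal numpy sketch of the graph convolutional layer of Kipf & Welling (2017) (an illustration of the layer itself, not the talk's model):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    A: (n, n) adjacency matrix, H: (n, d) node features, W: (d, d') weights.
    """
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # symmetric normalization
    norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return np.maximum(norm @ H @ W, 0.0)           # ReLU activation
```

Each layer aggregates features over a node's graph neighbourhood, so halos and galaxies can exchange information along the cosmic-web structure.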

Conclusions

  • Mathematics-based methods should be used whenever possible


  • Deep Learning allows you to automate Machine Learning


  • Generative models will be a key element in making our simulations more realistic