Upcoming weak lensing surveys will probe large fractions of the sky with unprecedented accuracy. To infer cosmological constraints, a large ensemble of survey simulations is required to accurately model cosmological observables and their covariances. We develop a parallelized multi-lens-plane pipeline called UFalcon, designed to generate full-sky weak lensing maps from lightcones within a minimal runtime. It makes use of L-PICOLA, an approximate numerical code, which provides a fast and accurate alternative to cosmological N-body simulations. The UFalcon maps are constructed by nesting two simulations covering a redshift range from z = 0.1 to 1.5 without replicating the simulation volume. We compute the convergence and projected overdensity maps for L-PICOLA in the lightcone or snapshot mode. The generation of such a map, including the L-PICOLA simulation, takes about 3 hours walltime on 220 cores. We use the maps to calculate the spherical harmonic power spectra, which we compare to t...
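The power spectrum comparison described above can be illustrated with a simplified estimator. The sketch below is a flat-sky FFT version (the actual UFalcon pipeline works with full-sky spherical harmonics, e.g. via HEALPix); the function name and binning choices are illustrative assumptions, not the pipeline's code:

```python
import numpy as np

def flat_sky_cl(kappa, pixel_size_rad, n_bins=20):
    """Flat-sky angular power spectrum of a square convergence map via FFT.
    Simplified illustration; full-sky pipelines use spherical harmonics."""
    ny, nx = kappa.shape
    fk = np.fft.fftn(kappa) * pixel_size_rad**2      # discretized Fourier transform
    power = np.abs(fk)**2 / (nx * ny * pixel_size_rad**2)  # divide by survey area
    # Multipole of each Fourier mode: ell = 2*pi * (cycles per radian)
    ell_x = np.fft.fftfreq(nx, d=pixel_size_rad) * 2 * np.pi
    ell_y = np.fft.fftfreq(ny, d=pixel_size_rad) * 2 * np.pi
    ell = np.sqrt(ell_x[None, :]**2 + ell_y[:, None]**2)
    # Average the mode powers in radial ell bins
    bins = np.linspace(ell.min(), ell.max(), n_bins + 1)
    idx = np.digitize(ell.ravel(), bins) - 1
    cl = np.array([power.ravel()[idx == i].mean() for i in range(n_bins)])
    centers = 0.5 * (bins[:-1] + bins[1:])
    return centers, cl

# Sanity check: white noise of unit variance has a flat spectrum
# with C_ell equal to the pixel solid angle
rng = np.random.default_rng(0)
noise_map = rng.normal(size=(128, 128))
centers, cl = flat_sky_cl(noise_map, 1e-3)
```

For white noise, the recovered spectrum should scatter around sigma^2 * Omega_pixel; comparing such estimates between simulated and theoretical spectra is the kind of validation step the abstract refers to.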
Weak lensing by large-scale structure, or 'cosmic shear', is a potentially powerful cosmological probe to shed new light on Dark Matter, Dark Energy and Modified Gravity. It is based on the weak distortions induced by large-scale structures on the observed shapes of distant galaxies through gravitational lensing. While the potential of this purely gravitational effect is great, results from this technique have been hampered because the measurement of this weak effect is difficult and limited by systematic effects. In particular, a demanding step is the measurement of the weak lensing shear from wide-field CCD images of galaxies. We describe the origin of the problem and propose a way forward for cosmic shear. Our proposed approach is based on Monte-Carlo Control Loops and draws upon methods widely used in particle physics and engineering. We describe the control loop scheme and show how it provides a calibration method based on fast image simulations tuned to reproduce the ...
We present extended modeling of the strong lens system RXJ1131-1231 with archival data in two HST bands, in combination with existing line-of-sight contribution and velocity dispersion estimates. Our focus is on source size and its influence on time-delay cosmography. We therefore examine the impact of the mass-sheet degeneracy, and especially the degeneracy pointed out by Schneider & Sluse (2013), using the source reconstruction scale. We also extend previous work by further exploring the effects of priors on the kinematics of the lens and the external convergence in the environment of the lensing system. Our results from RXJ1131-1231 are given in a simple analytic form so that they can be easily combined with constraints coming from other cosmological probes. We find that the choice of priors on lens model parameters and source size is subdominant to the statistical errors for H_0 measurements of this system. The choice of prior for the source is subdominant at present (2 u...
Quantifying the concordance between different cosmological experiments is important for testing the validity of theoretical models and systematics in the observations. In earlier work, we thus proposed the Surprise, a concordance measure derived from the relative entropy between posterior distributions. We revisit the properties of the Surprise and describe how it provides a general, versatile, and robust measure for the agreement between datasets. We also compare it to other measures of concordance that have been proposed for cosmology. As an application, we extend our earlier analysis and use the Surprise to quantify the agreement between WMAP 9, Planck 13 and Planck 15 constraints on the ΛCDM model. Using a principal component analysis in parameter space, we find that the large Surprise between WMAP 9 and Planck 13 (S = 17.6 bits, implying a deviation from consistency at 99.8%) is due to a shift along a direction that is dominated by the amplitude of the power spectrum. The Planck 15 ...
In light of the growing number of cosmological observations, it is important to develop versatile tools to quantify the constraining power and consistency of cosmological probes. Originally motivated by information theory, we use the relative entropy to compute the information gained by Bayesian updates in units of bits. This measure quantifies both the improvement in precision and the 'surprise', i.e. the tension arising from shifts in central values. Our starting point is a WMAP9 prior, which we update with observations of the distance ladder, supernovae (SNe), baryon acoustic oscillations (BAO), and weak lensing, as well as the 2015 Planck release. We consider the parameters of the flat ΛCDM concordance model and some of its extensions, which include curvature and the Dark Energy equation of state parameter w. We find that, relative to WMAP9 and within these model spaces, the probes that have provided the greatest gains are Planck (10 bits), followed by BAO surveys (5.1 bits) ...
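For Gaussian posteriors, the relative entropy used above has a closed form, so the information gain in bits can be evaluated directly. The sketch below is an illustrative numpy implementation (not the authors' code; the example numbers are invented):

```python
import numpy as np

def relative_entropy_bits(mu1, cov1, mu2, cov2):
    """KL divergence D(P2 || P1) between Gaussians, in bits.
    P1 = N(mu1, cov1) plays the role of the prior,
    P2 = N(mu2, cov2) the updated posterior."""
    mu1, mu2 = np.atleast_1d(float(np.asarray(mu1).ravel()[0]) if np.isscalar(mu1) else mu1), np.atleast_1d(mu2)
    mu1 = np.atleast_1d(mu1)
    cov1, cov2 = np.atleast_2d(cov1), np.atleast_2d(cov2)
    d = mu1.size
    inv1 = np.linalg.inv(cov1)
    diff = np.atleast_1d(mu2) - mu1
    nats = 0.5 * (np.trace(inv1 @ cov2) + diff @ inv1 @ diff - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov2)))
    return nats / np.log(2.0)  # convert nats to bits

# Example: a 1-D update that shrinks the standard deviation by a factor of 4
gain = relative_entropy_bits(0.0, 1.0, 0.0, 1.0 / 16.0)
```

A pure precision gain (no shift in central value) already yields a positive number of bits; shifts in the mean add the 'surprise' term `diff @ inv1 @ diff`.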
Dark matter in the universe evolves through gravity to form a complex network of halos, filaments, sheets and voids that is known as the cosmic web. Computational models of the underlying physical processes, such as classical N-body simulations, are extremely resource intensive, as they track the action of gravity in an expanding universe using billions of particles as tracers of the cosmic matter distribution. Therefore, upcoming cosmology experiments will face a computational bottleneck that may limit the exploitation of their full scientific potential. To address this challenge, we demonstrate the application of a machine learning technique called Generative Adversarial Networks (GANs) to learn models that can efficiently generate new, physically realistic realizations of the cosmic web. Our training set is a small, representative sample of 2D image snapshots from N-body simulations of size 500 and 100 Mpc. We show that the GAN-generated samples are qualitatively and quantitative...
We extend the results of previous analyses towards constraining the abundance and clustering of post-reionization (z ∼ 0-5) neutral hydrogen (HI) systems using a halo model framework. We work with a comprehensive HI dataset including the small-scale clustering, column density and mass function of HI galaxies at low redshifts, intensity mapping measurements at intermediate redshifts, and the UV/optical observations of Damped Lyman Alpha (DLA) systems at higher redshifts. We use a Markov Chain Monte Carlo (MCMC) approach to constrain the parameters of the best-fitting models, both for the HI-halo mass relation and the HI radial density profile. We find that a radial exponential profile results in a good fit to the low-redshift HI observations, including the clustering and the column density distribution. The form of the profile is also found to match the high-redshift DLA observations, when used in combination with a three-parameter HI-halo mass relation and a redshift evolution in the...
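The MCMC step at the heart of such parameter fits can be sketched with a minimal Metropolis sampler. This is a toy stand-in, not the analysis code: the Gaussian "posterior" and all numbers below are invented for the demonstration.

```python
import numpy as np

def metropolis(log_post, x0, step, n_steps, rng):
    """Minimal 1-D Metropolis sampler: propose a Gaussian jump,
    accept with probability min(1, posterior ratio)."""
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        prop = x + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop      # accept the proposal
        chain[i] = x                   # otherwise keep the current state
    return chain

rng = np.random.default_rng(3)
# Toy Gaussian posterior centered at 1.2 with width 0.3
log_post = lambda m: -0.5 * ((m - 1.2) / 0.3) ** 2
chain = metropolis(log_post, 0.0, 0.5, 20000, rng)
```

After discarding burn-in, the chain's sample mean and standard deviation recover the toy posterior's center and width; real analyses do the same in many dimensions with a physical likelihood.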
We present the scientific performance results of PynPoint, our Python-based software package that uses principal component analysis to detect and estimate the flux of exoplanets in two-dimensional imaging data. Recent advances in adaptive optics and imaging technology at visible and infrared wavelengths have opened the door to direct detections of planetary companions to nearby stars, but image processing techniques have yet to be optimized. We show that the performance of our approach gives a marked improvement over what is presently possible using existing methods such as LOCI. To test our approach, we use real angular differential imaging (ADI) data taken with the adaptive-optics-assisted high-resolution near-infrared camera NACO at the VLT. These data were taken during the commissioning of the apodising phase plate (APP) coronagraph. By inserting simulated planets into these data, we test the performance of our method as a function of planet brightness for different positions on...
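The core idea of PCA-based PSF subtraction for ADI data can be sketched in a few lines of numpy. This is a simplified illustration under assumed inputs (an aligned frame stack), not PynPoint's actual implementation:

```python
import numpy as np

def pca_psf_subtract(stack, n_modes=5):
    """Subtract a low-rank PSF model from an ADI image stack.
    stack: (n_frames, ny, nx) array of aligned frames.
    The stellar PSF dominates the leading principal components,
    while a faint companion moves from frame to frame and survives."""
    n_frames = stack.shape[0]
    flat = stack.reshape(n_frames, -1)
    centered = flat - flat.mean(axis=0)
    # Principal components of the stack via SVD
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_modes]                  # (n_modes, ny*nx) PSF modes
    coeffs = centered @ basis.T           # per-frame projection onto the modes
    residual = centered - coeffs @ basis  # PSF-subtracted frames
    return residual.reshape(stack.shape)

rng = np.random.default_rng(0)
frames = rng.normal(size=(20, 16, 16)) + 5.0  # toy stack with a common offset
res = pca_psf_subtract(frames, n_modes=3)
```

In a real pipeline the residual frames are then de-rotated to the sky orientation and stacked, so the quasi-static PSF averages away while the planet signal adds coherently.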
Modeling the Point Spread Function (PSF) of wide-field surveys is vital for many astrophysical applications and cosmological probes, including weak gravitational lensing. The PSF smears the image of any recorded object and therefore needs to be taken into account when inferring properties of galaxies from astronomical images. In the case of cosmic shear, the PSF is one of the dominant sources of systematic errors and must be treated carefully to avoid biases in cosmological parameters. Recently, forward modeling approaches to calibrate shear measurements within the Monte-Carlo Control Loops (MCCL) framework have been developed. These methods typically require simulating a large number of wide-field images; thus, the simulations need to be very fast yet have realistic properties in key features such as the PSF pattern. Hence, such forward modeling approaches require a very flexible PSF model, which is quick to evaluate and whose parameters can be estimated reliably from survey data. W...
We explore a new technique to measure cosmic shear using Einstein rings. In Birrer et al. (2017), we showed that the detailed modelling of Einstein rings can be used to measure external shear to high precision. In this letter, we explore how a collection of Einstein rings can be used as a statistical probe of cosmic shear. We present a forecast of the cosmic shear information available in Einstein rings for different strong lensing survey configurations. We find that, assuming that the number density of Einstein rings in the COSMOS survey is representative, future strong lensing surveys should have a cosmological precision comparable to current ground-based weak lensing surveys. We discuss how this technique is complementary to standard cosmic shear analyses, since it is sensitive to different systematics and can be used for cross-calibration.
Recent observations have led to the establishment of the concordance LCDM model for cosmology. A number of experiments are being planned to shed light on dark energy, dark matter, inflation and gravity, which are the key components of the model. To optimize and compare the reach of these surveys, several figures of merit have been proposed. They are based either on the forecasted precision on the LCDM model and its expansion, or on the expected ability to distinguish two models. We propose here another figure of merit that quantifies the capacity of future surveys to rule out the LCDM model. It is based on a measure of the difference in volume of observable space that the future surveys will constrain with and without imposing the model. This model-breaking figure of merit is easy to compute and can lead to different survey optimizations than other metrics. We illustrate its impact using a simple combination of supernovae and BAO mock observations and compare the respective merit of...
Weak lensing surveys provide a powerful probe of dark energy through the measurement of the mass distribution of the local Universe. A number of ground-based and space-based surveys are being planned for this purpose. Here, we study the optimal strategy for these future surveys, using the joint constraints on the equation of state parameter wn and its evolution wa as a figure of merit, by considering power spectrum tomography. For this purpose, we first consider an 'ideal' survey which is both wide and deep and exempt from systematics. We find that such a survey has great potential for dark energy studies, reaching one-sigma precisions of 1% and 10% on the two parameters respectively. We then study the relative impact of various limitations by degrading this ideal survey. In particular, we consider the effect of sky coverage, survey depth, shape measurement systematics, photometric redshift systematics and uncertainties in the non-linear power spectrum predictions. We find that,...
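Figure-of-merit comparisons of this kind are typically computed from a Fisher matrix: the merit is the inverse area of the marginalized error ellipse in the two equation-of-state parameters. The sketch below uses an invented 3-parameter toy Fisher matrix, purely for illustration:

```python
import numpy as np

def de_figure_of_merit(fisher, i_w, i_wa):
    """Dark energy figure of merit: 1/sqrt(det Cov) restricted to the two
    equation-of-state parameters, marginalized over all other parameters
    (i.e. invert the full Fisher matrix first, then take the 2x2 block)."""
    cov = np.linalg.inv(fisher)
    sub = cov[np.ix_([i_w, i_wa], [i_w, i_wa])]
    return 1.0 / np.sqrt(np.linalg.det(sub))

# Toy Fisher matrix for (Omega_m, w, wa) -- illustrative numbers only
F = np.array([[400.0,  50.0,  10.0],
              [ 50.0, 100.0,  30.0],
              [ 10.0,  30.0,  20.0]])
fom = de_figure_of_merit(F, 1, 2)
```

Note the order of operations: marginalizing means inverting the full matrix before extracting the block, which always inflates the errors relative to the fixed-parameter (conditional) case.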
Journal of Cosmology and Astroparticle Physics, 2021
Narrow-band imaging surveys allow the study of the spectral characteristics of galaxies without the need for spectroscopic follow-up. In this work, we forward-model the Physics of the Accelerating Universe Survey (PAUS) narrow-band data. The aim is to improve the constraints on the spectral coefficients used to create the galaxy spectral energy distributions (SEDs) of the galaxy population model in Tortorelli et al. 2020. In that work, the model parameters were inferred from the Canada-France-Hawaii Telescope Legacy Survey (CFHTLS) data using Approximate Bayesian Computation (ABC). This led to stringent constraints on the B-band galaxy luminosity function parameters, but left the spectral coefficients only broadly constrained. To address that, we perform an ABC inference using CFHTLS and PAUS data. This is the first time our approach combining forward-modelling and ABC is applied simultaneously to multiple datasets. We test the results of the ABC inference by comparin...
Likelihood-free inference provides a rigorous approach to performing Bayesian analysis using forward simulations only. The main advantage of likelihood-free methods is their ability to account for complex physical processes and observational effects in forward simulations. Here we explore the potential of likelihood-free forward modeling for Bayesian cosmological inference using the redshift evolution of the cluster abundance combined with weak-lensing mass calibration. We use two complementary likelihood-free methods, namely Approximate Bayesian Computation (ABC) and Density-Estimation Likelihood-Free Inference (DELFI), to develop an analysis procedure for the inference of the cosmological parameters (Ωm, σ8) and the mass scale of the survey sample. Adopting an eROSITA-like selection function and a 10% scatter in the observable–mass relation in a flat ΛCDM cosmology with Ωm = 0.286 and σ8 = 0.82, we create a synthetic catalog of observable-selected Navarro–Frenk–White clusters in...
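The ABC rejection step at the heart of such likelihood-free analyses can be sketched on a toy one-parameter problem. This is illustrative only: the real analysis forward-simulates cluster catalogs, not Gaussian draws, and the summary statistic and tolerance below are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" summary statistic: sample mean of data drawn at true mu = 0.3
mu_true = 0.3
data = rng.normal(mu_true, 1.0, size=500)
obs_stat = data.mean()

def simulate(mu):
    """Forward simulation: draw a mock dataset and return its summary."""
    return rng.normal(mu, 1.0, size=500).mean()

# ABC rejection: sample parameters from the prior and keep only those
# whose simulated summary lies within epsilon of the observed one
prior_draws = rng.uniform(-2.0, 2.0, size=20000)
epsilon = 0.05
accepted = np.array([mu for mu in prior_draws
                     if abs(simulate(mu) - obs_stat) < epsilon])
posterior_mean = accepted.mean()
```

The accepted draws approximate the posterior; shrinking epsilon makes the approximation tighter at the cost of a lower acceptance rate, which is why methods such as DELFI, which fit a density to the simulation outputs instead of rejecting them, can be far more simulation-efficient.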
We present a joint weak lensing and X-ray analysis of 4 deg² from the CFHTLS and XMM-LSS surveys. Our weak lensing analysis is the first analysis of a real survey using shapelets, a new-generation weak lensing analysis method. We create projected mass maps of the images and extract 6 weak-lensing-detected clusters of galaxies. We show that their counts can be used to constrain the power spectrum normalisation, σ8 = 0.92 +0.26/−0.30 for Ωm = 0.24. We show that despite the large scatter generally observed in the M-T relation derived from lensing masses, tight constraints on both its slope and normalisation M* can be obtained with a moderate number of sources, provided that the covered mass range is large enough. Adding clusters from to our sample, we measure M* = 2.71 +0.79/−0.61 × 10^14 h^−1 M⊙. Although they are dominated by shot noise and sample variance, our measurements are consistent with currently favoured values, and set the stage for future surveys. We thus investigate the dependence of those estimates on survey size, depth, and integration time, for joint weak lensing and X-ray surveys. We show that deep surveys should be dedicated to the study of the physics of clusters and groups of galaxies. For a given exposure time, wide surveys provide a larger number of detected clusters and are therefore preferred for the measurement of cosmological parameters such as σ8 and M*. We show that a wide survey of a few hundred square degrees is needed to improve upon current measurements of these parameters. More ambitious surveys covering 7000 deg² will provide 1% accuracy in the estimation of the power spectrum and M-T relation normalisations.
We present high-contrast observations of the circumstellar environment of the Herbig Ae/Be star HD100546. The final 3.8 µm image reveals an emission source at a projected separation of 0.48″ ± 0.04″ (corresponding to ∼47 ± 4 AU) at a position angle of 8.9° ± 0.9°. The emission appears slightly extended, with a point source component with an apparent magnitude of 13.2 ± 0.4 mag. The position of the source coincides with a local deficit in polarization fraction in near-infrared polarimetric imaging data, which probes the surface of the well-studied circumstellar disk of HD100546. This suggests a possible physical link between the emission source and the disk. Assuming a disk inclination of ∼47°, the de-projected separation of the object is ∼68 AU. Assessing the likelihood of various scenarios, we favor an interpretation of the available high-contrast data with a planet in the process of forming. Follow-up observations in the coming years can easily distinguish between the different possible scenarios empirically. If confirmed, HD100546 "b" would be a unique laboratory to study the formation process of a new planetary system, with one giant planet currently forming in the disk and a second planet possibly orbiting in the disk gap at smaller separations.
Journal of Cosmology and Astroparticle Physics, 2021
The combination of different cosmological probes offers stringent tests of the ΛCDM model and enhanced control of systematics. For this purpose, we present an extension of the lightcone generator UFalcon first introduced in Sgier et al., enabling the simulation of a self-consistent set of maps for different cosmological probes. Each realization is generated from the same underlying simulated density field, and contains full-sky maps of different probes, namely weak lensing shear, galaxy overdensity including RSD, CMB lensing, and CMB temperature anisotropies from the ISW effect. The lightcone generation performed by UFalcon is parallelized and based on the replication of a large periodic volume simulated with the GPU-accelerated N-body code PkdGrav3. The post-processing to construct the lightcones requires only a runtime of about 1 walltime-hour, corresponding to about 100 CPU-hours. We use a randomization procedure to increase the number of quasi-independent full-sky UFalcon map realizations, which enables us to compute an accurate multi-probe covariance matrix. Using this framework, we forecast cosmological parameter constraints by performing a multi-probe likelihood analysis for a combination of simulated future stage-IV-like surveys. We find that the inclusion of the cross-correlations between the probes significantly increases the information gain in the parameter constraints. We also find that the use of a non-Gaussian covariance matrix is increasingly important as more probes and cross-correlation power spectra are included. A version of the UFalcon package currently including weak gravitational lensing is publicly available.
We demonstrate the potential of Deep Learning methods for measurements of cosmological parameters from density fields, focusing on the extraction of non-Gaussian information. We consider weak lensing mass maps as our dataset. We aim for our method to be able to distinguish between five models, which were chosen to lie along the σ8–Ωm degeneracy and have nearly the same two-point statistics. We design and implement a Deep Convolutional Neural Network (DCNN) which learns the relation between five cosmological models and the mass maps they generate. We develop a new training strategy which ensures the good performance of the network for high levels of noise. We compare the performance of this approach to commonly used non-Gaussian statistics, namely the skewness and kurtosis of the convergence maps. We find that our implementation of DCNN outperforms the skewness and kurtosis statistics, especially for high noise levels. The network maintains the mean discrimination ...
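The skewness and kurtosis benchmarks mentioned above are simple pixel-level moments of the convergence map. A numpy sketch (with invented toy maps, not the paper's data) shows why they pick up non-Gaussian information:

```python
import numpy as np

def map_moments(kappa):
    """Skewness and excess kurtosis of a map's pixel distribution.
    Both vanish (in expectation) for a Gaussian field."""
    x = kappa.ravel()
    x = x - x.mean()
    sigma = x.std()
    skew = np.mean(x**3) / sigma**3
    kurt = np.mean(x**4) / sigma**4 - 3.0   # excess kurtosis
    return skew, kurt

rng = np.random.default_rng(1)
gauss_map = rng.normal(size=(256, 256))
# A lognormal-type map mimics the positive tail produced by
# nonlinear gravitational clustering
ln_map = np.exp(0.5 * gauss_map) - np.exp(0.125)

s_g, k_g = map_moments(gauss_map)
s_ln, k_ln = map_moments(ln_map)
```

Both maps can be tuned to share the same two-point statistics, yet their third and fourth moments differ strongly; a network that beats these moments is therefore extracting additional non-Gaussian information.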
Combining Galaxy-Galaxy Lensing and Galaxy Clustering
Combining galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth rate of large-scale structure, a quantity that will shed light on the mechanism driving the acceleration of the Universe. The Dark Energy Survey (DES) is a prime candidate for such an analysis, with its measurements of both the distribution of galaxies on the sky and the tangential shears of background galaxies induced by these foreground lenses. By constructing an end-to-end analysis that combines large-scale galaxy clustering and small-scale galaxy-galaxy lensing, we forecast the potential of a combined probes analysis on DES datasets. In particular, we develop a practical approach to a DES combined probes analysis by jointly modeling the assumptions and systematics affecting the different components of the data vector, employing a shared halo model, HOD parametrization, photometric redshift errors, and shear measurement errors. Furthermore, we study the effect of external pri...
Papers by Adam Amara