A&A, Volume 666, October 2022
Article Number: A89
Number of page(s): 11
Section: Numerical methods and codes
DOI: https://doi.org/10.1051/0004-6361/202243450
Published online: 13 October 2022
CENN: A fully convolutional neural network for CMB recovery in realistic microwave sky simulations
1 Departamento de Física, Universidad de Oviedo, C. Federico García Lorca 18, 33007 Oviedo, Spain
e-mail: casasjm@uniovi.es
2 Instituto Universitario de Ciencias y Tecnologías Espaciales de Asturias (ICTEA), C. Independencia 13, 33004 Oviedo, Spain
3 SISSA, Via Bonomea 265, 34136 Trieste, Italy
4 IFPU – Institute for fundamental physics of the Universe, Via Beirut 2, 34014 Trieste, Italy
5 INFN-Sezione di Trieste, Via Valerio 2, 34127 Trieste, Italy
6 Escuela de Ingeniería de Minas, Energía y Materiales, Independencia 13, 33004 Oviedo, Spain
Received: 2 March 2022
Accepted: 27 July 2022
Context. Component separation is the process by which emission sources in astrophysical maps are extracted, generally by taking multi-frequency information into account. Developing more reliable component-separation methods is crucial for future cosmic microwave background (CMB) experiments such as the Simons Observatory, CMB-S4, or the LiteBIRD satellite.
Aims. We aim to develop a machine learning method based on fully convolutional neural networks, called the CMB extraction neural network (CENN), in order to extract the CMB signal in total intensity by training the network with realistic simulations. The frequencies we used are the Planck channels 143, 217, and 353 GHz, and we validated the neural network throughout the sky and at three latitude intervals: 0° < |b| < 5°, 5° < |b| < 30°, and 30° < |b| < 90°. Moreover, we used neither Galactic nor point-source (PS) masks.
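The abstract does not reproduce the network architecture itself. As a rough illustration only, a generic fully convolutional encoder-decoder that maps three frequency channels to a single recovered CMB patch can be sketched as follows (`PatchFCN`, its layer counts, and channel widths are hypothetical choices, not the published CENN design):

```python
import torch
import torch.nn as nn

class PatchFCN(nn.Module):
    """Toy fully convolutional encoder-decoder: 3 frequency maps in, 1 CMB map out.
    Illustrative only -- not the published CENN architecture."""
    def __init__(self, in_channels=3):  # e.g. 143, 217, 353 GHz channels
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 8, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(16, 8, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(8, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        # Being fully convolutional, the network preserves the spatial size,
        # so it can be applied to patches of any (even) dimension.
        return self.decoder(self.encoder(x))

net = PatchFCN()
y = net(torch.randn(1, 3, 256, 256))  # one multi-frequency 256x256 patch
print(tuple(y.shape))  # (1, 1, 256, 256)
```

Because there are no fully connected layers, the same weights apply to every pixel neighbourhood, which is what allows training and validation on fixed-size sky patches.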
Methods. To train the neural network, we produced multi-frequency realistic simulations in the form of patches of 256 × 256 pixels that contained the CMB signal; Galactic thermal dust, cosmic infrared background, and PS emission; the thermal Sunyaev–Zel’dovich effect from galaxy clusters; and instrumental noise. After validating the network, we compared the power spectra from input and output maps. We analysed the power spectrum from the residuals at each latitude interval and throughout the sky, and we studied how our model handled high contamination at small scales.
Results. We obtained a CMB power spectrum with a mean difference between input and output of 13 ± 113 µK² for multipoles up to ℓ ~ 4000. We computed the residuals, obtaining 700 ± 60 µK² for 0° < |b| < 5°, 80 ± 30 µK² for 5° < |b| < 30°, and 30 ± 20 µK² for 30° < |b| < 90°, for multipoles up to ℓ ~ 4000. For the entire sky, we obtained 30 ± 10 µK² for ℓ ≤ 1000 and 20 ± 10 µK² for ℓ ≤ 4000. We validated the neural network in a single patch with strong contamination at small scales, obtaining a difference between input and output of 50 ± 120 µK² and residuals of 40 ± 10 µK² up to ℓ ~ 2500. In all cases, the uncertainty of each measure was taken as the standard deviation.
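The input–output comparison above rests on estimating an angular power spectrum per patch. A minimal flat-sky sketch using a 2D FFT and radial binning in multipole ℓ is shown below; the function name, pixel size, normalization, and binning are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

def flat_sky_cl(patch, pix_arcmin=1.5, nbins=32):
    """Rough flat-sky angular power spectrum of a square map patch.
    Pixel size and normalization are illustrative, up to an overall factor."""
    n = patch.shape[0]
    pix_rad = np.radians(pix_arcmin / 60.0)
    fmap = np.fft.fftshift(np.fft.fft2(patch))
    power = np.abs(fmap) ** 2 * (pix_rad ** 2 / n ** 2)  # per-mode power
    # In the flat-sky approximation, multipole ell ~ 2*pi*|k|
    freq = np.fft.fftshift(np.fft.fftfreq(n, d=pix_rad))
    kx, ky = np.meshgrid(freq, freq)
    ell = 2.0 * np.pi * np.hypot(kx, ky)
    # Average the power in linear annular bins of ell
    bins = np.linspace(0.0, ell.max(), nbins + 1)
    which = np.digitize(ell.ravel(), bins) - 1
    pw = power.ravel()
    cl = np.array([pw[which == b].mean() if np.any(which == b) else 0.0
                   for b in range(nbins)])
    centers = 0.5 * (bins[1:] + bins[:-1])
    return centers, cl

rng = np.random.default_rng(0)
ell, cl = flat_sky_cl(rng.normal(size=(256, 256)))  # white-noise test patch
```

A residual analysis like the one reported would then subtract the output patch from the input CMB patch, pass the difference through such an estimator, and quote the mean and standard deviation of the binned spectrum over the validation set.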
Conclusions. The results show that fully convolutional neural networks are promising methods for performing component separation in future CMB experiments. Moreover, we show that CENN is reliable against different levels of contamination from Galactic and PS foregrounds at both large and small scales.
Key words: techniques: image processing / cosmic background radiation / submillimeter: general
© J. M. Casas et al. 2022
Open Access article, published by EDP Sciences, under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
This article is published in open access under the Subscribe-to-Open model.