Issue: A&A, Volume 698, May 2025
Article Number: A61
Number of page(s): 19
Section: Numerical methods and codes
DOI: https://doi.org/10.1051/0004-6361/202553785
Published online: 06 June 2025
Deep learning inference with the Event Horizon Telescope
II. The ZINGULARITY framework for Bayesian artificial neural networks
1 Department of Astrophysics, Institute for Mathematics, Astrophysics and Particle Physics (IMAPP), Radboud University, PO Box 9010, 6500 GL Nijmegen, The Netherlands
2 Max-Planck-Institut für Radioastronomie, Auf dem Hügel 69, 53121 Bonn, Germany
3 Steward Observatory and Department of Astronomy, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721, USA
4 Data Science Institute, University of Arizona, 1230 N. Cherry Ave., Tucson, AZ 85721, USA
5 Program in Applied Mathematics, University of Arizona, 617 N. Santa Rita Ave., Tucson, AZ 85721, USA
6 Department of Astrophysical Sciences, Peyton Hall, Princeton University, Princeton, NJ 08544, USA
7 Instituto de Astrofísica de Andalucía-CSIC, Glorieta de la Astronomía s/n, 18008 Granada, Spain
★ Corresponding author: M.Janssen@astro.ru.nl
Received: 16 January 2025
Accepted: 30 March 2025
Context. In this second paper in our publication series, we present the open-source ZINGULARITY framework for parameter inference with deep Bayesian artificial neural networks. We carried out supervised learning with synthetic millimeter very long baseline interferometry observations of the Event Horizon Telescope (EHT). Our ground-truth models are based on general relativistic magnetohydrodynamic simulations of Sgr A* and M87* on horizon scales. The models predict the synchrotron emission produced by these accreting supermassive black hole systems.
Aims. We investigated how well ZINGULARITY neural networks are able to infer key model parameters from EHT observations, such as the black hole spin and the magnetic state of the accretion disk, when uncertainties in the data are accurately taken into account.
Methods. ZINGULARITY makes use of the TENSORFLOW PROBABILITY library and is able to handle large amounts of data by combining the efficient TFRecord data format with the HOROVOD framework for distributed deep learning. Ours is the first analysis of EHT data with Bayesian neural networks; it employs an unprecedented amount of training data, a closely modeled EHT signal path, and the full information content of the observational data. ZINGULARITY infers parameters based on salient features in the data and is containerized for scientific reproducibility.
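The defining feature of the Bayesian networks described here is that each weight carries a learned posterior distribution rather than a point estimate, so repeated stochastic forward passes yield a spread of predictions that quantifies parameter uncertainty. The following is a minimal, hypothetical numpy sketch of that mechanism (it is not ZINGULARITY code, and the toy posterior values are invented); in TENSORFLOW PROBABILITY the same idea is provided by variational layers such as those in `tfp.layers`.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Toy Gaussian posterior over the weights of a single dense layer,
# standing in for what variational training would actually learn.
w_mean = rng.normal(size=(4, 2))   # posterior means
w_std = 0.1 * np.ones((4, 2))      # posterior standard deviations

def stochastic_forward(x, n_samples=100):
    """Run n_samples forward passes, each with freshly sampled weights."""
    outputs = []
    for _ in range(n_samples):
        w = rng.normal(w_mean, w_std)  # draw weights from the posterior
        outputs.append(x @ w)          # linear layer (activation omitted)
    outputs = np.stack(outputs)
    # Predictive mean and the epistemic uncertainty induced by the
    # weight posterior.
    return outputs.mean(axis=0), outputs.std(axis=0)

x = rng.normal(size=(1, 4))            # one input "observation"
mean, sigma = stochastic_forward(x)
print(mean.shape, sigma.shape)         # two arrays of shape (1, 2)
```

A network trained this way can flag uncharacterizable inputs: data far from the training distribution produce inflated predictive spreads, which is the failure-mode behavior the Results section refers to.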
Results. Through parameter surveys and dedicated validation tests, we identified neural network architectures that are robust against internal stochastic processes and unaffected by noise in the observational and model data. We give examples of how different data properties affect the network training. We show how the Bayesian nature of our networks gives trustworthy uncertainties and uncovers failure modes for uncharacterizable data.
Conclusions. It is easy to achieve low validation errors during training on synthetic data with neural networks, particularly when the forward modeling is too simplified. Through careful studies, we demonstrate that our trained networks can generalize well so that reliable results can be obtained from observational data.
Key words: methods: data analysis / techniques: high angular resolution / techniques: interferometric
© The Authors 2025
Open Access article, published by EDP Sciences, under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
This article is published in open access under the Subscribe to Open model. Subscribe to A&A to support open access publication.