
Table 5

SDC1 scores and related properties for source catalogs from different teams and methods.

| Team (method) | $M_s$ (score) | $N_\mathrm{det}$ | $N_\mathrm{match}$ | $N_\mathrm{false}$ | $N_\mathrm{bad}$ | Purity | $\bar{s}$ |
|---|---|---|---|---|---|---|---|
| **Post-challenge results** | | | | | | | |
| MINERVA (YOLO-CIANNA) | **480450** | 724480 | 680000 | 44480 | 16839 | 93.86% | 0.7719 |
| — purity-focus thresholds | 418434 | 541542 | 536412 | 5130 | 2506 | **99.06%** | 0.7896 |
| JLRAT2 (JSFM2) | **298201** | 502146 | 484212 | 17934 | 2274 | 96.43% | 0.6529 |
| **Original challenge results** | | | | | | | |
| Engage-SKA (PROFOUND) | **200939** | 421992 | 418384 | 3608 | 2677 | 99.15% | 0.4889 |
| Shanghai (multiple methods) | **158841** | 292646 | 291553 | 1093 | 698 | 99.63% | 0.5486 |
| ICRAR (CLARAN) | **142784** | 279898 | 259806 | 20092 | 6875 | 92.82% | 0.6269 |

Notes. Bold marks the target metric that each result optimizes.
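The columns of the table are internally consistent, and the consistency can be checked mechanically. A minimal sketch, assuming the usual SDC1 scoring convention (each matched source contributes a per-source score in [0, 1], each false detection costs −1; the challenge's actual scoring pipeline is more involved): detections split as $N_\mathrm{det} = N_\mathrm{match} + N_\mathrm{false}$, purity is $N_\mathrm{match}/N_\mathrm{det}$, and $M_s \approx N_\mathrm{match}\,\bar{s} - N_\mathrm{false}$ up to the rounding of $\bar{s}$:

```python
# Sanity-check the relations between the columns of Table 5.
# The numbers are copied from the table; the scoring relation
# Ms ~ Nmatch * s_bar - Nfalse is an assumption based on the
# SDC1 convention (+score per match, -1 per false detection).
rows = {
    # team: (Ms, Ndet, Nmatch, Nfalse, purity_pct, s_bar)
    "MINERVA (YOLO-CIANNA)":       (480450, 724480, 680000, 44480, 93.86, 0.7719),
    "purity-focus thresholds":     (418434, 541542, 536412,  5130, 99.06, 0.7896),
    "JLRAT2 (JSFM2)":              (298201, 502146, 484212, 17934, 96.43, 0.6529),
    "Engage-SKA (PROFOUND)":       (200939, 421992, 418384,  3608, 99.15, 0.4889),
    "Shanghai (multiple methods)": (158841, 292646, 291553,  1093, 99.63, 0.5486),
    "ICRAR (CLARAN)":              (142784, 279898, 259806, 20092, 92.82, 0.6269),
}

for team, (ms, ndet, nmatch, nfalse, purity, s_bar) in rows.items():
    # Detections split into matches and false positives.
    assert ndet == nmatch + nfalse, team
    # Purity is the matched fraction of all detections.
    assert abs(100 * nmatch / ndet - purity) < 0.01, team
    # Score ~ sum of per-match scores minus the false-detection penalty;
    # the small residual comes from s_bar being rounded to 4 digits.
    assert abs((nmatch * s_bar - nfalse) - ms) < nmatch * 1e-4, team
```

All six rows satisfy the three relations, which supports the column reconstruction above.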
