TREC-COVID Archive

Round 5 Runs

TREC-COVID received a total of 126 runs from 28 participating teams in Round 5. The limit on the number of runs per team was raised to eight for this final round.

The submitted run file plus evaluation report for each run are listed here. Note that the submission file is the file that was evaluated, meaning that all previously judged documents have been removed from it. This includes previously judged documents whose doc-ids changed in the Round 5 version of CORD-19.
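The filtering itself is easy to reproduce. The sketch below (not NIST's actual script; the file names are hypothetical) removes previously judged documents from a run in the standard TREC format (topic-id Q0 doc-id rank score run-tag), given a cumulative qrels file covering the earlier rounds. Because trec_eval orders documents by score rather than by the rank column, the gaps the filtering leaves in the rank column do not affect evaluation.

    # Sketch: build the residual run that is actually evaluated by removing
    # every (topic, doc-id) pair already judged in an earlier round.
    # File names are hypothetical; the line formats are the standard TREC ones.
    judged = set()
    with open("qrels-cumulative.txt") as qrels:      # "topic iter doc-id judgment"
        for line in qrels:
            topic, _, docid, _ = line.split()
            judged.add((topic, docid))

    with open("run.txt") as run, open("run-residual.txt", "w") as out:
        for line in run:                             # "topic Q0 doc-id rank score tag"
            topic, _, docid, _, _, _ = line.split()
            if (topic, docid) not in judged:
                out.write(line)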

A ranking of top runs, sorted by mean score for one of five measures (NDCG@20, Precision@20, bpref, RBP(p=.5), and MAP), is here. The table includes every run that placed in the top 60 by at least one of these measures; consequently, a run omitted from the table may still have a better score on some measure than a run that is included.
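The official scores were produced with standard evaluation tooling, but for readers unfamiliar with the measures, here is a minimal sketch of two of them, RBP and NDCG@k, computed over a single topic's ranked relevance judgments. RBP is binarized here and the NDCG uses the linear-gain variant; treat this as illustration, not the evaluation code.

    import math

    def rbp(rels, p=0.5):
        # Rank-biased precision (Moffat & Zobel): (1 - p) * sum_k rel_k * p^(k-1),
        # where rels is the ranked list of binary relevance values for one topic.
        return (1 - p) * sum(rel * p ** k for k, rel in enumerate(rels))

    def ndcg_at(rels, judged_rels, k=20):
        # NDCG@k with graded relevance (linear gain): DCG@k divided by the
        # ideal DCG@k, where the ideal ranking sorts all judged relevance
        # values for the topic in descending order.
        dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(rels[:k]))
        ideal = sorted(judged_rels, reverse=True)
        idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
        return dcg / idcg if idcg > 0 else 0.0

    # Hypothetical example: graded judgments of the top-ranked documents for
    # one topic (TREC-COVID judged 0 = not relevant, 1 = partial, 2 = relevant).
    ranked = [2, 0, 1, 2, 0]
    print(rbp([1 if r > 0 else 0 for r in ranked]))              # binarized RBP(p=.5)
    print(ndcg_at(ranked, judged_rels=[2, 2, 1, 1, 0, 0], k=20))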

Round 4 Runs

TREC-COVID received 72 runs from 27 participating teams in Round 4 (including baseline runs from the "anserini" team). The submitted run file plus evaluation report for each run are listed here. Note that the submission file is the file that was evaluated, meaning that all previously judged documents have been removed from it. This includes previously judged documents whose doc-ids changed in the Round 4 version of CORD-19.

A ranking of top runs, sorted by mean score for one of five measures (NDCG@20, Precision@20, bpref, RBP(p=.5), and MAP), is here. The table includes every run that placed in the top 35 by at least one of these measures; consequently, a run omitted from the table may still have a better score on some measure than a run that is included.

Round 3 Runs

TREC-COVID received 79 runs from 31 participating teams in Round 3 (totals include three baseline runs from the "anserini" team). The submitted run file plus evaluation report for each run are listed here. Note that the submission file is the file that was evaluated, meaning that all previously judged documents have been removed from it. Because the doc-ids of some previously judged documents changed between the Round 1 and Round 2 judgment sets and the Round 3 data set, NIST removed these documents from the submissions; almost all runs had some documents removed.

A ranking of top runs, sorted by mean score for one of five measures (NDCG@10, Precision@5, bpref, RBP(p=.5), and MAP), is here.

Round 2 Runs

TREC-COVID received 136 runs from 51 participating teams in Round 2 (totals include two baseline runs from the "anserini" team). The submitted run file plus evaluation report for each run are listed here.

A ranking of top runs, sorted by mean score for one of five measures (NDCG@10, Precision@5, bpref, RBP(p=.5), and MAP), is here.

Round 1 Runs

TREC-COVID received 143 runs from 56 participating teams in Round 1. The submitted run file plus evaluation report for each run are listed here.

A ranking of top runs, sorted by mean score for one of four measures (NDCG@10, Precision@5, bpref, and MAP), is here.