The NTCIR Fair Web Task


Last updated: March 15, 2023.
Twitter: @ntcirfairweb


Updates

Test topics can be found here! Run submission instructions can be found here! Baseline runs with the HTML files (for ReRanked runs) can be found here! (March 15)
(NOTE: We might drop some topics for the official evaluation if we fail to find relevant entities for them.)
The GFR (Group Fairness and Relevance) evaluation framework paper (Sakai/Kim/Kang) has been accepted by ACM TOIS! The finalised paper will be made publicly available soon! (March 14)

Target corpus announced! Our task will use Chuweb21D-60 (49.8M web pages). Available from the Chuweb21D page! (February 27, 2023)
Pilot data can be found here! This package demonstrates how our pilot runs (for four sample topics) were actually evaluated using our framework. (February 15, 2023)
Here are the four sample topics! (October 3, 2022)


Task Specifications

Introduction to FairWeb-1 (slides; updated Feb 2023)


Timeline (GMT+9 except where indicated)

blue: what participants will do
red: what organisers will do

October 3, 2022: Release of 1st CFP with sample topics and evaluation protocol [DONE]
February 15, 2023: Pilot relevance assessments for the sample topics and a few pilot runs released [DONE]
February 1-March 10, 2023: Topic development [DONE]
March 15, 2023: Topics released [DONE]
April 15, 2023: Task registrations due (Anywhere on Earth). Sign up from this NTCIR-17 page!
May 16, 2023: Run submissions due
May 17-July 31, 2023: Entity annotations; runs evaluated
August 1, 2023: Evaluation results and draft overview released
September 1, 2023: Draft participant papers due
November 1, 2023: Camera-ready papers due
December 2023: NTCIR-17@NII, Tokyo, Japan

Papers

(In Japanese) Tetsuya Sakai: A Web Search Task Considering Group Fairness (グループフェアネスを考慮したウェブ検索タスク), IPSJ SIG Technical Reports 2022-DBS-175/2022-IFAT-148, No.6, 2022. paper slides

Tetsuya Sakai, Jin Young Kim, and Inho Kang: A Versatile Framework for Evaluating Ranked Lists in terms of Group Fairness and Relevance, arXiv:2204.00280, 2022.

Tetsuya Sakai, Jin Young Kim, and Inho Kang: A Versatile Framework for Evaluating Ranked Lists in terms of Group Fairness and Relevance, ACM TOIS, to appear, 2023.


Links

NTCIR
TREC Fair Ranking Track


Organisers

Email: fairweb1org at list.waseda.jp

Sijie Tao, Nuo Chen, Tetsuya Sakai (Waseda University, Japan)
Zhumin Chu (Tsinghua University, P.R.C.)
Nicola Ferro (University of Padua, Italy)
Maria Maistro (University of Copenhagen, Denmark)
Ian Soboroff (NIST, USA)
Hiromi Arai (RIKEN AIP, Japan)