Welcome to the Retinal Fundus Glaucoma Challenge! REFUGE is organized as a half-day challenge in conjunction with the 5th MICCAI Workshop on Ophthalmic Medical Image Analysis (OMIA), a satellite event of the MICCAI 2018 conference in Granada, Spain.

The REFUGE challenge is partnering with OMIA to widen the opportunities to present your work at MICCAI. In addition to the traditional oral and poster presentations of OMIA, REFUGE offers the chance to try your software on a real challenge, this year with fundus images.


The goal of the challenge is to evaluate and compare automated algorithms for glaucoma detection and optic disc/cup segmentation on a common dataset of retinal fundus images. 
We invite the medical image analysis community to participate by developing and testing existing and novel automated classification and segmentation methods.

Updates

  • Dec 1: The ONLINE iChallenges have moved to the Baidu Research Open-Access Dataset (BROAD) for a better experience: http://ai.baidu.com/broad/introduction
    The first two iChallenges are iChallenge-GNO (REFUGE) and iChallenge-AMD.
  • Oct 20: The training data of the AMD challenge, with the first classification task only, is released on the REFUGE website.
  • Oct 19: The ground truth of the validation set and the on-site test images are released to all participants.
    A new bi-weekly updated online challenge on the test set will open on Nov 4, and the corresponding leaderboards will be published on a new individual page.
  • Oct 17: The on-site leaderboards with the exact performance values of the 12 teams are released.
  • Sep 19: Welcome to OMIA-REFUGE. A few key points for the on-site participants: 1) Location: Conference Center, Lv -2, Room Machado; 2) Time: the data (download link) is released at 11am and results submission ends at 3pm (teams have roughly 4 hours to generate their results); 3) Oral talk: please send your slides to redkisses121@gmail.com, or bring a thumb drive and hand over the slides before 3pm.
  • Aug 28: The REFUGE program is released; each team has a 10-minute oral presentation, including 3 minutes of Q&A.
  • Aug 11: The REFUGE ONLINE Challenge re-opens. Each team can submit results once every two weeks, and the online leaderboards will be updated accordingly on Aug 16, Aug 30, and Sep 10, i.e. three times prior to the on-site test. New participants can join these online leaderboards at any time. This is NOT related to the off-site/on-site challenge scores.
    The annotations of the VALIDATION set will be publicly available after the on-site challenge, and the biweekly updated leaderboards on the TEST dataset will go online soon.
  • Aug 4: The FINAL on-site challenge is ranked with Final Score = 0.3*Validation_Overall_Rank + 0.7*On-site_Overall_Rank, and prizes will be awarded to the teams with the three smallest final scores. Certificates (no cash prize) will also be given to the top 3 teams on each individual task.
  • Aug 4: REFUGE paper management system (CMT platform) opens: https://cmt3.research.microsoft.com/REFUGE2018/
    If an invitation letter from REFUGE is specifically needed for your participation in the on-site challenge, please resubmit your Technical Report (i.e. paper, with the full author list, i.e. the names of all team members) together with the MICCAI w/ OMIA-REFUGE registration confirmation of a team member to this platform as soon as possible. The acceptance/invitation letter will be issued through the CMT system.
    To participate in the on-site REFUGE challenge, a team should satisfy the following criteria:
    1) At least 1 team member is registered for REFUGE event through MICCAI registration portal; 
    2) At least 1 team member is confirmed to show up in person on Sep 20 @ REFUGE venue (Room Machado, Conference Center, Granada); a team cannot be represented by any person not registered as a team member in the registration email; 
    3) The team ranks in the top 50% (i.e. 14 of the 28 qualified teams) on at least one leaderboard: Classification, Segmentation, or Overall. The overall score is calculated as 0.4*Classification_Rank + 0.6*Segmentation_Rank (lower is better). If you are not sure whether your team is qualified to take part in the on-site challenge, please email redkisses121@gmail.com for an official confirmation.
    Teams that are qualified (top 50% on any leaderboard) but cannot send a team member to the on-site challenge should email redkisses121@gmail.com ASAP.
  • Aug 1: Validation results are available on the Results page.
  • Jul 28: Paper draft submission deadline is Today (July 28th, 23:59 PT). Please submit by sending E-mail to the organizers. Having a paper draft submitted is a requirement for appearing on the leaderboard.
  • Jul 27: Today (July 27th, 23:59 PT) is the deadline for submitting results on the validation images. From now on, no intermediate results will be e-mailed.
    Your last submission will count towards the leaderboard.
  • Jul 26: Reminder: the off-site validation set results submission deadline is July 27th, 23:59 PT. Your last submission counts towards the leaderboard. It is up to each team to decide whether to resubmit one of the previously evaluated results (out of a maximum of 4) or try a new one.
  • Jul 23: You can now test your submission using the code here: https://github.com/ignaciorlando/refuge-evaluation
  • Jul 18: The submission page is online.
  • Jul 16: Baseline results are available here: https://github.com/HzFu/REFUGE_baseline
  • Jul 15: Validation images are released.
  • Jul 12: The annotations of three non-glaucoma images (n0058, n0093, n0271) were updated; please download the updated file or the whole package.
  • Jun 30: The disc and cup masks (BMP files) were updated; please refer to Details. The download links were updated accordingly.
  • Jun 26: The training image n0070.jpg was left-right flipped; the download links were updated accordingly.
  • Jun 20: Annotations of the training images are released.
  • Jun 15: Training images with glaucoma labels are released.
  • Jun 15: Website is updated.
  • Jun 15: Redundant/dummy or incomplete registrations were removed; please complete all required information and resend the registration request through EMAIL.
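The two ranking formulas in the updates above (the overall score from the Aug 4 qualification criteria and the final on-site score) can be sketched as follows. This is an illustrative sketch only: the weights come from the rules above, but the function names and example ranks are hypothetical and not part of the official evaluation code.

```python
def overall_score(classification_rank, segmentation_rank):
    """Overall score = 0.4*Classification_Rank + 0.6*Segmentation_Rank (lower is better)."""
    return 0.4 * classification_rank + 0.6 * segmentation_rank

def final_score(validation_overall_rank, onsite_overall_rank):
    """Final Score = 0.3*Validation_Overall_Rank + 0.7*On-site_Overall_Rank (lower is better)."""
    return 0.3 * validation_overall_rank + 0.7 * onsite_overall_rank

# Hypothetical example: a team ranked 2nd in classification and 5th in
# segmentation gets an overall score of 0.4*2 + 0.6*5 = 3.8.
score = overall_score(2, 5)
```

Note that both scores combine ranks, not raw metric values, and in both cases a smaller score is better.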

The related rules are copied below for your reference.

  • Anonymous registration is NOT allowed. All information entered when registering a team, including the (TRUE) Name of the contact person, the Affiliation (including department, full name of university/institute/company, country) and the E-mail address must be COMPLETE and CORRECT.
  • Incomplete registrations will be removed without notice.
  • Redundant registrations will be removed without notice.

 

How to Participate?

If you want to participate in the challenge, you can register on the website, download the training and test data, submit your results, and attend the OMIA/REFUGE workshop, provided you agree to the rules of the challenge.

Please contact the organizers if you have any questions. 

 

Important Dates

10 June 2018: Registration opens.
15 June 2018:  Training images are released.
20 June 2018:  Annotations (unified fovea location, disc, and cup masks) of the Training set are released.
15 July 2018: Off-site validation set is released.
18 July 2018:

Submission of results on the off-site validation set opens.
- A team must be registered through email and approved before it can
  submit results and receive a score on the leaderboard;
- A team can submit up to 4 intermediate results in total and receive performance feedback through email within two days.
The last (at most the 5th) submission will count towards the leaderboard.

27 July 2018: Off-site validation set results submission deadline (July 27th, 23:59 PT).
28 July 2018: Paper SUBMISSION deadline (a draft manuscript, which can be updated until Aug 29).
-- This allows invitation letters to be issued to those who need one for travel approval.
01 Aug 2018: Off-site leaderboard PUBLISHED and the best teams (with paper submitted) INVITED for the on-site REFUGE.
29 Aug 2018: Camera-ready paper SUBMISSION deadline.
20 Sep 2018: On-site REFUGE (1 hour) in conjunction with OMIA workshop at MICCAI 2018.

 

Statistics


Number of users: 322