CVPR 2018 Workshop and Challenge: Automated Analysis of Marine Video for Environmental Monitoring
Submission Process Overview
The submission process is handled on our data challenge site, challenge.kitware.com. For a successful submission, the following steps must be taken:
- For each folder in the challenge data release, run your classifier on the imagery and annotation data in that folder. The output of your classifier must be in the COCO results format. The name of your output file MUST be foldername.mscoco.json, where foldername is the folder on which you ran your classifier to generate the results. See the File Names section below for the list of valid submission filenames.
- Note that, because the challenge data consists of images only, we’ve released a JSON file that maps filenames to the corresponding image_id. Use this file to look up the image_ids for your result submissions. Please go here for the download.
- You can submit detections for as few as one or as many as all of the folders contained in the released challenge data. The scorer will only score on the files you submit.
- **NOTE** At present, we are only accepting submissions for bounding box detections; this excludes the NWFSC and AFSC datasets, which contain keypoints. We are working on addressing this limitation. Your submission files must therefore be in the valid results format and contain only bounding box detections, or the scorer will fail.
- Once you have generated the files you want to submit, go to the submission site for our challenge on challenge.kitware.com.
- On that page, you will see a green button that says “Submit your results”.
- You’ll be prompted to enter a submission title (choose something descriptive that contains your team name).
- You will see another green button that says “Browse or drop files here”. When you click that button a file browser should open, allowing you to select one or more submission files. Choose the file(s) you would like to submit for scoring. Ensure the files follow the naming requirements described in the first step above or, in more detail, in the File Names section below. Click “Choose”.
- Press the “Start Upload” button. You’ll see progress bars showing the upload progress.
- Once the submission is uploaded, refresh your browser and you will see a progress wheel spinning that says “Your submission is being scored, please wait…”. The scoring process can take some time, depending on server load and other factors. Feel free to navigate away from that page.
- Once the scoring is completed, you’ll receive an email from email@example.com notifying you. Click the link in the email to see your scores, broken down by dataset/submission. You can also download your submission to verify that what you expected to be scored was actually scored.
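To illustrate the first step, the sketch below builds a foldername.mscoco.json results file from per-image detections using the released filename-to-image_id mapping. The exact structure of the mapping file is an assumption here (a flat JSON object from filename string to integer id); adjust to the file you actually download.

```python
import json

def build_results(mapping_path, detections, out_path):
    """Convert per-image detections keyed by filename into a COCO results file.

    `detections` maps an image filename to a list of
    (category_id, [x, y, w, h], score) tuples.
    """
    with open(mapping_path) as f:
        # Assumed mapping layout: {"img_0001.png": 1, ...}
        filename_to_id = json.load(f)

    results = []
    for filename, dets in detections.items():
        image_id = filename_to_id[filename]
        for category_id, bbox, score in dets:
            results.append({
                "image_id": image_id,
                "category_id": category_id,
                "bbox": bbox,  # [x, y, width, height], COCO convention
                "score": score,
            })

    with open(out_path, "w") as f:
        json.dump(results, f)
```

Running this once per dataset folder, with `out_path` set to the matching foldername.mscoco.json, yields a set of files ready to upload.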
File Names
The challenge data was released with one folder per dataset. The submission format closely mirrors this structure, with one results file per dataset. A full submission would consist of files with the names:
afsc.mscoco.json habcam.mscoco.json mbari1.mscoco.json mouss1.mscoco.json mouss2.mscoco.json mouss3.mscoco.json mouss4.mscoco.json mouss5.mscoco.json nwfsc.mscoco.json
If you would like to make a partial submission, you only need to submit files with the above names corresponding to the datasets you would like to be scored on.
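As a sanity check before uploading, you might verify your filenames against the valid set above; a minimal sketch (the helper name is our own, not part of the challenge tooling):

```python
# Valid submission filenames, taken from the list above.
VALID_NAMES = {
    "afsc.mscoco.json", "habcam.mscoco.json", "mbari1.mscoco.json",
    "mouss1.mscoco.json", "mouss2.mscoco.json", "mouss3.mscoco.json",
    "mouss4.mscoco.json", "mouss5.mscoco.json", "nwfsc.mscoco.json",
}

def invalid_submission_names(filenames):
    """Return the filenames that do not match any valid submission name."""
    return [name for name in filenames if name not in VALID_NAMES]
```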
The results files submitted must be in the COCO results format. Failure to comply with this standard will cause your submission to fail.
Further, please note that your submission files may (currently) contain only bounding box annotations, not keypoints. We are working on scoring support for keypoints, but for now a submission containing them will cause the scorer to fail.
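A lightweight pre-submission check along these lines (a sketch, not the official validator) can catch both failure modes: entries missing the required COCO results fields and entries carrying keypoints.

```python
import json

# Fields required by the COCO bounding-box results format.
REQUIRED_KEYS = {"image_id", "category_id", "bbox", "score"}

def validate_results_file(path):
    """Check that a results file is a list of bbox-only COCO result entries."""
    with open(path) as f:
        results = json.load(f)
    assert isinstance(results, list), "top level must be a JSON array"
    for i, entry in enumerate(results):
        missing = REQUIRED_KEYS - entry.keys()
        assert not missing, f"entry {i} missing {missing}"
        assert "keypoints" not in entry, f"entry {i} contains keypoints"
        assert len(entry["bbox"]) == 4, f"entry {i} bbox must be [x, y, w, h]"
```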
Submissions must be made to the challenge site.